adds basic support for remote schemas/schema stitching (#952)

.gitignore (vendored)
@@ -3,3 +3,4 @@ npm-debug.log
*.DS_Store
.tern-project
test-server-output
.vscode
README.md
@@ -8,7 +8,7 @@
<a href="https://twitter.com/intent/follow?screen_name=HasuraHQ"><img src="https://img.shields.io/badge/Follow-HasuraHQ-blue.svg?style=flat&logo=twitter"></a>
<a href="https://eepurl.com/dBUfJ5"><img src="https://img.shields.io/badge/newsletter-subscribe-yellow.svg?style=flat"></a>

Hasura GraphQL Engine is a blazing-fast GraphQL server that gives you **instant, realtime GraphQL APIs over Postgres**, with [**webhook triggers**](event-triggers.md) on database events for asynchronous business logic.
Hasura GraphQL Engine is a blazing-fast GraphQL server that gives you **instant, realtime GraphQL APIs over Postgres**, with [**webhook triggers**](event-triggers.md) on database events, and [**remote schemas**](remote-schemas.md) for business logic.

Hasura helps you build GraphQL apps backed by Postgres or incrementally move to GraphQL for existing applications using Postgres.

@@ -28,6 +28,7 @@ Read more at [hasura.io](https://hasura.io) and the [docs](https://docs.hasura.i

* **Make powerful queries**: Built-in filtering, pagination, pattern search, bulk insert, update, delete mutations
* **Realtime**: Convert any GraphQL query to a live query by using subscriptions
* **Merge remote schemas**: Access custom GraphQL schemas for business logic via a single GraphQL Engine endpoint. [**Read more**](remote-schemas.md).
* **Trigger webhooks or serverless functions**: On Postgres insert/update/delete events ([read more](event-triggers.md))
* **Works with existing, live databases**: Point it to an existing Postgres database to instantly get a ready-to-use GraphQL API
* **Fine-grained access control**: Dynamic access control that integrates with your auth system (eg: auth0, firebase-auth)

@@ -47,7 +48,7 @@ Read more at [https://hasura.io](https://hasura.io) and the [docs](https://docs.
- [Architecture](#architecture)
- [Client-side tooling](#client-side-tooling)
- [Add business logic](#add-business-logic)
  - [Custom resolvers](#custom-resolvers)
  - [Remote schemas](#remote-schemas)
  - [Trigger webhooks on database events](#trigger-webhooks-on-database-events)
- [Demos](#demos)
  - [Realtime applications](#realtime-applications)

@@ -87,7 +88,7 @@ guides](https://docs.hasura.io/1.0/graphql/manual/getting-started/index.html) or

The Hasura GraphQL Engine fronts a Postgres database instance and can accept GraphQL requests from your client apps. It can be configured to work with your existing auth system and can handle access control using field-level rules with dynamic variables from your auth system.

You can also place the engine behind a central GraphQL proxy that fronts multiple GraphQL APIs via schema stitching.
You can also merge remote GraphQL schemas and provide a unified GraphQL API.

![Hasura GraphQL Engine architecture](assets/hasura-arch.svg)

@@ -97,11 +98,11 @@ Hasura works with any GraphQL client. We recommend using [Apollo Client](https:/

## Add business logic

### Custom resolvers
GraphQL Engine provides easy-to-reason, scalable and performant methods for adding custom business logic to your backend:

Add custom resolvers in addition to Hasura GraphQL engine. Ideal for delegating to HTTP APIs, making direct calls to another data-source or writing business logic in code - [read more](community/boilerplates/custom-resolvers).

### Remote schemas

Add custom resolvers in a remote schema in addition to Hasura's Postgres-based GraphQL schema. Ideal for use-cases like implementing a payment API, or querying data that is not in your database - [read more](remote-schemas.md).

### Trigger webhooks on database events

@@ -109,6 +110,10 @@ Add asynchronous business logic that is triggered based on database events.
Ideal for notifications, data-pipelines from Postgres or asynchronous processing - [read more](event-triggers.md).

### Derived data or data transformations

Transform data in Postgres or run business logic on it to derive another dataset that can be queried using GraphQL Engine - [read more](https://docs.hasura.io/1.0/graphql/manual/business-logic/index.html).

## Demos

Check out all the example applications in the
(binary image modified: 146 KiB → 172 KiB)
assets/remote-schemas-arch.png (new binary file, 67 KiB)
@@ -29,6 +29,7 @@ var ravenVersions = []mt.Version{

var testMetadata = map[string][]byte{
	"metadata": []byte(`query_templates: []
remote_schemas: []
tables:
- array_relationships: []
  delete_permissions: []
@@ -40,6 +41,7 @@ tables:
  update_permissions: []
`),
	"empty-metadata": []byte(`query_templates: []
remote_schemas: []
tables: []
`),
}
@@ -1,6 +1,9 @@
# GraphQL Custom Resolver Example

This is a simple example of using a custom resolver with Hasura's GraphQL API.
> **NOTE**: You can now merge [Remote Schemas](../../../remote-schemas.md) from [GraphQL servers](../graphql-servers) using Hasura.
> - Boilerplates for custom GraphQL servers have been moved [here](../graphql-servers). Also, a recently released feature removes the need for an external GraphQL gateway by letting you merge remote schemas in GraphQL Engine itself - [read more](../../../remote-schemas.md) (*please check the caveats for current limitations of the feature*).
> - Once schemas have been merged in GraphQL Engine, Hasura proxies requests to the remote GraphQL servers.
> - Adding another layer in front of GraphQL Engine impacts performance by as much as **4X**, due to the serialization-deserialization overhead. Using an external GraphQL gateway is recommended only if your use case is blocked on any of the current limitations.

## Motivation
community/boilerplates/graphql-servers/README.md (new file)
@@ -0,0 +1,15 @@
# GraphQL Server Boilerplates

Hasura GraphQL Engine can combine schemas from multiple remote GraphQL servers
and expose them at a single endpoint. You can write these GraphQL servers in any
language and Hasura takes care of stitching together the schema from these
servers ([read more](../../../remote-schemas.md)).

This directory contains boilerplates for writing GraphQL servers using various
languages and frameworks.

- [Docs on Remote Schemas](https://docs.hasura.io/1.0/graphql/manual/remote-schemas/index.html)

## Architecture

![Remote schema architecture diagram](../../../assets/remote-schemas-arch.png)
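Once such a server is deployed, it can be merged into GraphQL Engine through the metadata API. A minimal sketch of the request body is shown below; the schema name, URL, and env-var header name are all hypothetical placeholders, and the exact field names should be checked against the Remote Schemas docs for your engine version:

```javascript
// Hypothetical payload for merging a remote schema via Hasura's metadata API.
// The name, url and header below are illustrative placeholders, not values
// taken from this PR.
const addRemoteSchema = {
  type: 'add_remote_schema',
  args: {
    name: 'my-remote-schema',
    definition: {
      url: 'https://my-graphql-server.example.com/graphql',
      // header values can also be resolved from environment variables
      headers: [{ name: 'Authorization', value_from_env: 'REMOTE_AUTH_TOKEN' }],
    },
  },
};

// This body would be POSTed to the engine's query endpoint, e.g. /v1/query.
console.log(addRemoteSchema.type); // add_remote_schema
```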
community/boilerplates/graphql-servers/nodejs-apollo/.gitignore (vendored, new file)
@@ -0,0 +1,2 @@
node_modules
package-lock.json
@@ -0,0 +1,11 @@
FROM node:8

WORKDIR /server

COPY ./package.json /server/

RUN npm install

COPY . /server/

CMD ["npm", "start"]
@@ -0,0 +1,57 @@
# GraphQL server using NodeJS and Apollo

A boilerplate GraphQL server using NodeJS and [Apollo Server](https://www.apollographql.com/docs/apollo-server/).

## Deploying

Clone the repo:

```bash
git clone https://github.com/hasura/graphql-engine
cd graphql-engine/community/boilerplates/graphql-servers/nodejs-apollo
```

### Using Zeit Now

Install the [Zeit Now](https://zeit.co/now) CLI:

```bash
npm install -g now
```

Deploy the server:

```bash
now
```

Get the URL and make a sample query:

```bash
curl https://app-name-something.now.sh/graphql \
  -H 'Content-Type:application/json' \
  -d'{"query":"{ hello }"}'

{"data":{"hello":"Hello World!"}}
```

You can also visit the Now URL to open GraphiQL.

## Running locally

Running the server locally:

```bash
npm install
npm start
```

Running the server using Docker:

```bash
docker build -t nodejs-apollo-graphql .
docker run -p 4000:4000 nodejs-apollo-graphql
```

The GraphQL endpoint will be `http://localhost:4000/graphql`.

**Note**: When GraphQL Engine is running in a Docker container, `localhost` will point to the container's local interface, not the host's interface. You might have to use the host's Docker IP or a specific DNS label based on your OS.
@@ -0,0 +1,22 @@
{
  "name": "nodejs-apollo-gql-server",
  "version": "1.0.0",
  "description": "A GraphQL server boilerplate written in NodeJS using Apollo Server",
  "main": "server.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "start": "node -r esm server.js"
  },
  "author": "Hasura",
  "license": "MIT",
  "dependencies": {
    "apollo-server": "^2.2.1",
    "graphql": "^14.0.2",
    "graphql-tools": "^4.0.3"
  },
  "devDependencies": {
    "esm": "^3.0.84"
  }
}
@@ -0,0 +1,50 @@
const { ApolloServer } = require('apollo-server');
const { makeExecutableSchema } = require('graphql-tools');

// default to 4000 so the Docker port mapping in the README works out of the box
const port = process.env.PORT || 4000;

let count = 0;

const typeDefs = `
  type Query {
    hello: String!
    count: Int!
  }

  type Mutation {
    increment_counter: count_mutation_response!
  }

  type count_mutation_response {
    new_count: Int!
  }
`;

const resolvers = {
  Query: {
    hello: () => {
      return "Hello World!";
    },
    count: () => {
      return count;
    }
  },
  Mutation: {
    increment_counter: () => {
      return { new_count: ++count };
    }
  }
};

const schema = makeExecutableSchema({
  typeDefs,
  resolvers
});

const server = new ApolloServer({
  schema
});

server.listen({ port }).then(({ url }) => {
  console.log(`GraphQL server running at ${url}`);
});
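The counter logic in the boilerplate above can be exercised without starting a server by calling the resolver functions directly. This is a standalone sketch of the same resolver map, not part of the PR:

```javascript
// Standalone sketch of the counter resolvers defined in the boilerplate above.
let count = 0;
const resolvers = {
  Query: {
    hello: () => 'Hello World!',
    count: () => count,
  },
  Mutation: {
    // each call bumps the shared counter and reports the new value
    increment_counter: () => ({ new_count: ++count }),
  },
};

console.log(resolvers.Mutation.increment_counter()); // { new_count: 1 }
console.log(resolvers.Query.count()); // 1
```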
community/boilerplates/graphql-servers/nodejs-express/.gitignore (vendored, new file)
@@ -0,0 +1,2 @@
node_modules
package-lock.json
@@ -0,0 +1,11 @@
FROM node:8

WORKDIR /server

COPY ./package.json /server/

RUN npm install

COPY . /server/

CMD ["npm", "start"]
@@ -0,0 +1,57 @@
# GraphQL server using NodeJS-Express

A boilerplate GraphQL server using NodeJS-Express and the official [graphql-js](https://graphql.github.io/graphql-js/running-an-express-graphql-server/) library.

## Deploying

Clone the repo:

```bash
git clone https://github.com/hasura/graphql-engine
cd graphql-engine/community/boilerplates/graphql-servers/nodejs-express
```

### Using Zeit Now

Install the [Zeit Now](https://zeit.co/now) CLI:

```bash
npm install -g now
```

Deploy the server:

```bash
now
```

Get the URL and make a sample query:

```bash
curl https://app-name-something.now.sh/graphql \
  -H 'Content-Type:application/json' \
  -d'{"query":"{ hello }"}'

{"data":{"hello":"Hello World!"}}
```

You can also visit the `/graphql` endpoint of the Now URL to open GraphiQL.

## Running locally

Running the server locally:

```bash
npm install
npm start
```

Running the server using Docker:

```bash
docker build -t nodejs-express-graphql .
docker run -p 4000:4000 nodejs-express-graphql
```

The GraphQL endpoint will be `http://localhost:4000/graphql`.

**Note**: When GraphQL Engine is running in a Docker container, `localhost` will point to the container's local interface, not the host's interface. You might have to use the host's Docker IP or a specific DNS label based on your OS.
@@ -0,0 +1,3 @@
{
  "version": 1
}
@@ -0,0 +1,20 @@
{
  "name": "nodejs-express-gql-server",
  "version": "1.0.0",
  "description": "A GraphQL server boilerplate for NodeJS-Express using the graphql-js library",
  "main": "server.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "start": "node -r esm server.js"
  },
  "author": "Hasura",
  "license": "MIT",
  "dependencies": {
    "express": "^4.16.4",
    "express-graphql": "^0.7.1",
    "graphql": "^14.0.2"
  },
  "devDependencies": {
    "esm": "^3.0.84"
  }
}
@@ -0,0 +1,44 @@
const express = require('express');
const graphqlHTTP = require('express-graphql');
const { buildSchema } = require('graphql');

let count = 0;
// default to 4000 so the Docker port mapping in the README works out of the box
const port = process.env.PORT || 4000;

// Construct a schema, using GraphQL schema language
const schema = buildSchema(`
  type Query {
    hello: String!
    count: Int!
  }

  type Mutation {
    increment_counter: count_mutation_response!
  }

  type count_mutation_response {
    new_count: Int!
  }
`);

// The root provides a resolver function for each API endpoint
const root = {
  hello: () => {
    return 'Hello world!';
  },
  count: () => {
    return count;
  },
  increment_counter: () => {
    return { new_count: ++count };
  }
};

var app = express();
app.use('/graphql', graphqlHTTP({
  schema: schema,
  rootValue: root,
  graphiql: true,
}));
app.listen(port);
console.log(`Running a GraphQL API server at localhost:${port}/graphql`);
community/boilerplates/graphql-servers/python-flask-graphene/.gitignore (vendored, new file)
@@ -0,0 +1,3 @@
venv
*.temp
__pycache__
@@ -0,0 +1,9 @@
FROM python:3-alpine

COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt

WORKDIR app
COPY server.py server.py

CMD ["gunicorn", "-b", "0.0.0.0:5000", "server:app"]
@@ -0,0 +1,63 @@
# GraphQL server using python-flask-graphene

A boilerplate Python GraphQL server using
[Flask](https://github.com/graphql-python/flask-graphql) and
[Graphene](https://github.com/graphql-python/graphene).

## Deploying

Clone the repo:

```bash
git clone https://github.com/hasura/graphql-engine
cd graphql-engine/community/boilerplates/graphql-servers/python-flask-graphene
```

### Using Zeit Now

Install the [Zeit Now](https://zeit.co/now) CLI:

```bash
npm install -g now
```

Deploy the server:

```bash
now
```

Get the URL and make a sample query:

```bash
curl https://python-flask-graphene-vynveodwau.now.sh/graphql \
  -H 'Content-Type:application/json' \
  -d'{"query":"{ hello }"}'

{"data":{"hello":"Hello World!"}}
```

You can also visit the Now URL to open GraphiQL:
[`https://python-flask-graphene-vynveodwau.now.sh/graphql`](https://python-flask-graphene-vynveodwau.now.sh/graphql).

## Running locally

Running the server locally:

```bash
pip install -r requirements.txt

export FLASK_APP=server.py
flask run
```

Running the server using Docker:

```bash
docker build -t python-flask-graphene .
docker run -p 5000:5000 python-flask-graphene
```

The GraphQL endpoint will be `http://localhost:5000/graphql`.

**Note**: When GraphQL Engine is running in a Docker container, `localhost` will point to the container's local interface, not the host's interface. You might have to use the host's Docker IP or a specific DNS label based on your OS.
@@ -0,0 +1,3 @@
{
  "version": 1
}
@@ -0,0 +1,4 @@
Flask==1.0.2
Flask-GraphQL==2.0.0
graphene==2.1.3
gunicorn==19.9.0
@@ -0,0 +1,41 @@
from flask import Flask
from flask_graphql import GraphQLView
import graphene

# create the flask application
app = Flask(__name__)

# a global variable to store the count temporarily
count = 0

# create a query root using graphene
class Query(graphene.ObjectType):
    # create a graphql node
    hello = graphene.String(description='A node which says Hello World!')
    # write a resolver for this node
    def resolve_hello(self, info):
        return "Hello World!"

    count = graphene.Int(description='Current value of the counter')
    def resolve_count(self, info):
        return count

# increment_counter mutation
class IncrementCounter(graphene.Mutation):
    new_count = graphene.Int(description='Updated value of the counter')

    def mutate(self, info):
        global count
        count += 1
        return IncrementCounter(new_count=count)

# mutation root
class Mutation(graphene.ObjectType):
    increment_counter = IncrementCounter.Field(description='Increment the value of the counter by 1')

# create a schema object using the query and mutation roots
schema = graphene.Schema(query=Query, mutation=Mutation)

# bind the graphql view to the flask application
app.add_url_rule('/graphql', view_func=GraphQLView.as_view('graphql', schema=schema, graphiql=True))
console/cypress/helpers/remoteSchemaHelpers.js (new file)
@@ -0,0 +1,21 @@
export const baseUrl = Cypress.config('baseUrl');
export const getRemoteSchemaName = (i, schemaName) =>
  `test-remote-schema-${schemaName}-${i}`;
export const getRemoteGraphQLURL = () =>
  'https://hasura-realtime-poll.herokuapp.com/v1alpha1/graphql';
export const getRemoteGraphQLURLFromEnv = () => 'GRAPHQL_URL';
export const getInvalidRemoteSchemaUrl = () => 'http://httpbin.org/post';
export const getHeaderAccessKey = i => `ACCESS_KEY-${i}`;
export const getHeaderAccessKeyValue = () => 'b94264abx98';

export const getElementFromAlias = alias => `[data-test=${alias}]`;
export const makeDataAPIUrl = dataApiUrl => `${dataApiUrl}/v1/query`;
export const makeDataAPIOptions = (dataApiUrl, key, body) => ({
  method: 'POST',
  url: makeDataAPIUrl(dataApiUrl),
  headers: {
    'x-hasura-access-key': key,
  },
  body,
  failOnStatusCode: false,
});
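As an illustration, the request-options helper above produces a `cy.request`-ready object. The helpers are re-stated inline so the snippet runs standalone; the URL and key values are made-up examples:

```javascript
// Re-stated from the helpers above so this snippet runs standalone.
const makeDataAPIUrl = dataApiUrl => `${dataApiUrl}/v1/query`;
const makeDataAPIOptions = (dataApiUrl, key, body) => ({
  method: 'POST',
  url: makeDataAPIUrl(dataApiUrl),
  headers: { 'x-hasura-access-key': key },
  body,
  failOnStatusCode: false,
});

// Example values, not from the test suite:
const opts = makeDataAPIOptions('http://localhost:8080', 'secret-key', { type: 'select' });
console.log(opts.url); // http://localhost:8080/v1/query
```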
@@ -0,0 +1,277 @@
import {
  getElementFromAlias,
  baseUrl,
  getRemoteSchemaName,
  getInvalidRemoteSchemaUrl,
  getRemoteGraphQLURL,
  getRemoteGraphQLURLFromEnv,
} from '../../../helpers/remoteSchemaHelpers';

import { validateRS } from '../../validators/validators';

const testName = 'rs';

export const checkCreateRemoteSchemaRoute = () => {
  cy.visit('/remote-schemas/manage/schemas', {
    onBeforeLoad(win) {
      cy.stub(win, 'prompt').returns('DELETE');
    },
  });

  cy.wait(2000);
  cy.get(getElementFromAlias('data-create-remote-schemas')).click();
  cy.url().should('eq', `${baseUrl}/remote-schemas/manage/add`);
  cy.wait(5000);
};

export const failRSWithInvalidRemoteUrl = () => {
  cy.get(getElementFromAlias('remote-schema-schema-name')).type(
    getRemoteSchemaName(0, testName)
  );
  cy.get(getElementFromAlias('remote-schema-graphql-url-input')).type(
    getInvalidRemoteSchemaUrl()
  );

  cy.get(getElementFromAlias('add-remote-schema-submit')).click();

  validateRS(getRemoteSchemaName(0, testName), 'failure');
  cy.wait(5000);
};

export const createSimpleRemoteSchema = () => {
  cy.get(getElementFromAlias('remote-schema-schema-name'))
    .clear()
    .type(getRemoteSchemaName(1, testName));
  cy.get(getElementFromAlias('remote-schema-graphql-url-input'))
    .clear()
    .type(getRemoteGraphQLURL());
  cy.get(getElementFromAlias('add-remote-schema-submit')).click();
  cy.wait(10000);
  validateRS(getRemoteSchemaName(1, testName), 'success');
  cy.url().should(
    'eq',
    `${baseUrl}/remote-schemas/manage/${getRemoteSchemaName(
      1,
      testName
    )}/details`
  );
  cy.wait(5000);
};

export const failRSDuplicateSchemaName = () => {
  cy.visit('remote-schemas/manage/add');
  cy.get(getElementFromAlias('remote-schema-schema-name'))
    .clear()
    .type(getRemoteSchemaName(1, testName));
  cy.get(getElementFromAlias('remote-schema-graphql-url-input'))
    .clear()
    .type(getRemoteGraphQLURL());
  cy.get(getElementFromAlias('add-remote-schema-submit')).click();
  cy.wait(5000);
  cy.url().should('eq', `${baseUrl}/remote-schemas/manage/add`);
  cy.wait(5000);
};

export const failRSDuplicateSchemaNodes = () => {
  cy.visit('remote-schemas/manage/add');
  cy.get(getElementFromAlias('remote-schema-schema-name'))
    .clear()
    .type(getRemoteSchemaName(2, testName));
  cy.get(getElementFromAlias('remote-schema-graphql-url-input'))
    .clear()
    .type(getRemoteGraphQLURL());
  cy.get(getElementFromAlias('add-remote-schema-submit')).click();
  cy.wait(5000);
  cy.url().should('eq', `${baseUrl}/remote-schemas/manage/add`);
  cy.wait(5000);
};

export const deleteSimpleRemoteSchemaFailUserConfirmationError = () => {
  cy.visit(
    `remote-schemas/manage/${getRemoteSchemaName(1, testName)}/details`,
    {
      onBeforeLoad(win) {
        cy.stub(win, 'prompt').returns('InvalidInput');
      },
    }
  );

  cy.get(getElementFromAlias('remote-schemas-modify')).click();
  cy.wait(5000);
  cy.get(getElementFromAlias('remote-schema-edit-delete-btn')).click();
  cy.wait(5000);
  cy.window()
    .its('prompt')
    .should('be.called');

  cy.get(getElementFromAlias('delete-confirmation-error')).should('exist');
  cy.wait(5000);
};

export const deleteSimpleRemoteSchema = () => {
  // Are you absolutely sure?\nThis action cannot be undone. This will permanently delete stitched GraphQL schema. Please type "DELETE" (in caps, without quotes) to confirm.\n
  cy.visit(
    `remote-schemas/manage/${getRemoteSchemaName(1, testName)}/details`,
    {
      onBeforeLoad(win) {
        cy.stub(win, 'prompt').returns('DELETE');
      },
    }
  );

  cy.get(getElementFromAlias('remote-schemas-modify')).click();
  cy.wait(5000);
  cy.get(getElementFromAlias('remote-schema-edit-delete-btn')).click();
  cy.wait(5000);
  cy.window()
    .its('prompt')
    .should('be.called');
  cy.get(getElementFromAlias('delete-confirmation-error')).should('not.exist');
  cy.wait(5000);
};

export const failWithRemoteSchemaEnvUrl = () => {
  cy.visit('remote-schemas/manage/add');
  cy.get(getElementFromAlias('remote-schema-schema-name'))
    .clear()
    .type(getRemoteSchemaName(3, testName));
  cy.get(
    getElementFromAlias('remote-schema-graphql-url-dropdown-button')
  ).click();
  cy.get(
    getElementFromAlias('remote-schema-graphql-url-dropdown-item-2')
  ).click();
  cy.get(getElementFromAlias('remote-schema-graphql-url-input'))
    .clear()
    .type(getRemoteGraphQLURLFromEnv());
  cy.get(getElementFromAlias('add-remote-schema-submit')).click();
  cy.wait(5000);
  cy.url().should('eq', `${baseUrl}/remote-schemas/manage/add`);
  cy.wait(5000);
};

export const failWithRemoteSchemaEnvHeader = () => {
  cy.visit('remote-schemas/manage/add');
  cy.get(getElementFromAlias('remote-schema-schema-name'))
    .clear()
    .type(getRemoteSchemaName(3, testName));
  cy.get(getElementFromAlias('remote-schema-graphql-url-input'))
    .clear()
    .type(getRemoteGraphQLURL());

  cy.get(getElementFromAlias('remote-schema-header-test1-key'))
    .clear()
    .type('sampleHeader1');

  cy.get(getElementFromAlias('remote-schema-header-test1-input'))
    .clear()
    .type('sampleHeaderValue1');

  cy.get(getElementFromAlias('remote-schema-header-test2-key'))
    .clear()
    .type('sampleHeader2');

  cy.get(
    getElementFromAlias('remote-schema-header-test2-dropdown-button')
  ).click();

  cy.get(
    getElementFromAlias('remote-schema-header-test2-dropdown-item-2')
  ).click();

  cy.get(getElementFromAlias('remote-schema-header-test2-input'))
    .clear()
    .type('SAMPLE_ENV_HEADER');
  cy.get(getElementFromAlias('add-remote-schema-submit')).click();
  cy.wait(5000);
  cy.url().should('eq', `${baseUrl}/remote-schemas/manage/add`);
  cy.wait(5000);
};

export const passWithRemoteSchemaHeader = () => {
  cy.visit('remote-schemas/manage/add');
  cy.get(getElementFromAlias('remote-schema-schema-name'))
    .clear()
    .type(getRemoteSchemaName(3, testName));
  cy.get(getElementFromAlias('remote-schema-graphql-url-input'))
    .clear()
    .type(getRemoteGraphQLURL());

  cy.get(getElementFromAlias('remote-schema-header-test1-key'))
    .clear()
    .type('sampleHeader1');

  cy.get(getElementFromAlias('remote-schema-header-test1-input'))
    .clear()
    .type('sampleHeaderValue1');

  cy.get(getElementFromAlias('remote-schema-header-test2-key'))
    .clear()
    .type('sampleHeader2');

  cy.get(getElementFromAlias('remote-schema-header-test2-input'))
    .clear()
    .type('sampleHeaderValue2');

  cy.get(getElementFromAlias('add-remote-schema-submit')).click();
  cy.wait(5000);
  validateRS(getRemoteSchemaName(3, testName), 'success');
  cy.url().should(
    'eq',
    `${baseUrl}/remote-schemas/manage/${getRemoteSchemaName(
      3,
      testName
    )}/details`
  );
  cy.wait(5000);
};

export const passWithEditRemoteSchema = () => {
  cy.visit(
    `${baseUrl}/remote-schemas/manage/${getRemoteSchemaName(
      3,
      testName
    )}/modify`
  );
  cy.wait(3000);
  cy.get(getElementFromAlias('remote-schema-edit-modify-btn'))
    .should('exist')
    .click();
  cy.get(getElementFromAlias('remote-schema-schema-name'))
    .clear()
    .type(getRemoteSchemaName(5, testName));

  cy.get(getElementFromAlias('remote-schema-edit-save-btn')).click();
  cy.wait(5000);
  validateRS(getRemoteSchemaName(5, testName), 'success');

  cy.get(getElementFromAlias('remote-schemas-modify')).click();
  cy.get(getElementFromAlias('remote-schema-schema-name')).should(
    'have.attr',
    'value',
    getRemoteSchemaName(5, testName)
  );
  cy.get(getElementFromAlias('remote-schema-edit-modify-btn')).should('exist');
  cy.wait(5000);
};

export const deleteRemoteSchema = () => {
  cy.visit(
    `remote-schemas/manage/${getRemoteSchemaName(5, testName)}/details`,
    {
      onBeforeLoad(win) {
        cy.stub(win, 'prompt').returns('DELETE');
      },
    }
  );

  cy.get(getElementFromAlias('remote-schemas-modify')).click();
  cy.wait(5000);
  cy.get(getElementFromAlias('remote-schema-edit-delete-btn')).click();
  cy.wait(5000);
  cy.window()
    .its('prompt')
    .should('be.called');

  cy.get(getElementFromAlias('delete-confirmation-error')).should('not.exist');
};
@@ -0,0 +1,68 @@
/* eslint no-unused-vars: 0 */
/* eslint import/prefer-default-export: 0 */
import { testMode } from '../../../helpers/common';
import { setMetaData } from '../../validators/validators';

import {
  checkCreateRemoteSchemaRoute,
  failRSWithInvalidRemoteUrl,
  createSimpleRemoteSchema,
  failRSDuplicateSchemaName,
  failRSDuplicateSchemaNodes,
  deleteSimpleRemoteSchema,
  deleteSimpleRemoteSchemaFailUserConfirmationError,
  failWithRemoteSchemaEnvUrl,
  failWithRemoteSchemaEnvHeader,
  passWithRemoteSchemaHeader,
  passWithEditRemoteSchema,
  deleteRemoteSchema,
} from './spec';

const setup = () => {
  describe('Setup route', () => {
    it('Visit the index route', () => {
      // Visit the index route
      cy.visit('/remote-schemas/manage/schemas');
      cy.wait(7000);
      // Get and set validation metadata
      setMetaData();
    });
  });
};

export const runCreateRemoteSchemaTableTests = () => {
  describe('Create Remote Schema', () => {
    it(
      'Create remote schema button opens the correct route',
      checkCreateRemoteSchemaRoute
    );
    it(
      'Fails to create remote schema with an invalid remote url',
      failRSWithInvalidRemoteUrl
    );
    it('Create a simple remote schema', createSimpleRemoteSchema);
    it('Fails to add remote schema with same name', failRSDuplicateSchemaName);
    it(
      'Fails to add remote schema which is already added',
      failRSDuplicateSchemaNodes
    );
    it(
      'Delete simple remote schema fails due to user confirmation error',
      deleteSimpleRemoteSchemaFailUserConfirmationError
    );
    it('Delete simple remote schema', deleteSimpleRemoteSchema);
    it(
      'Fails to create remote schema with url from env',
      failWithRemoteSchemaEnvUrl
    );
    it(
      'Fails to create remote schema with headers from env',
      failWithRemoteSchemaEnvHeader
    );
    it('Create remote schema with headers', passWithRemoteSchemaHeader);
    it('Edit remote schema with headers', passWithEditRemoteSchema);
    it('Delete remote schema with headers', deleteRemoteSchema);
  });
};

if (testMode !== 'cli') {
  setup();
  runCreateRemoteSchemaTableTests();
}
@@ -30,6 +30,35 @@ export const createView = sql => {

// ******************* VALIDATION FUNCTIONS *******************************

// ******************* Remote schema Validator ****************************
export const validateRS = (remoteSchemaName, result) => {
  const reqBody = {
    type: 'select',
    args: {
      table: {
        name: 'remote_schemas',
        schema: 'hdb_catalog',
      },
      columns: ['*'],
      where: {
        name: remoteSchemaName,
      },
    },
  };
  const requestOptions = makeDataAPIOptions(dataApiUrl, accessKey, reqBody);
  cy.request(requestOptions).then(response => {
    if (result === 'success') {
      expect(
        response.body.length > 0 && response.body[0].name === remoteSchemaName
      ).to.be.true;
    } else {
      expect(
        response.body.length > 0 && response.body[0].name === remoteSchemaName
      ).to.be.false;
    }
  });
};
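The validator above boils down to a single `select` against the `hdb_catalog.remote_schemas` catalog table: a schema "exists" iff a row with the expected name comes back. A minimal sketch of that logic as pure functions (the helper names `buildRemoteSchemaSelect` and `schemaExists` are ours for illustration, not part of the console code):

```javascript
// Hypothetical helper mirroring the reqBody built inside validateRS:
// a `select` over hdb_catalog.remote_schemas filtered by schema name.
const buildRemoteSchemaSelect = remoteSchemaName => ({
  type: 'select',
  args: {
    table: { name: 'remote_schemas', schema: 'hdb_catalog' },
    columns: ['*'],
    where: { name: remoteSchemaName },
  },
});

// A schema exists iff the select returned a row with a matching name.
const schemaExists = (rows, remoteSchemaName) =>
  rows.length > 0 && rows[0].name === remoteSchemaName;
```

The same pair of assertions covers both the `success` and failure branches of `validateRS`, depending on whether the row is expected to be present.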

// ****************** Table Validator *********************

export const validateCT = (tableName, result) => {
@@ -125,6 +125,7 @@ table thead tr th
  margin-left: 10px;
  width: 15px;
  min-width: 15px;
  cursor: pointer;
}
.display_flex
{
@@ -753,6 +754,20 @@ code
{
  outline: none;
}
.danger_button
{
  border-radius: 5px;
  // color: #292929;
  color: #000;
  border: 1px solid #c9302c;
  padding: 5px 10px;
  font-size: 12px;
  line-height: 1.5;
}
.danger_button:focus
{
  outline: none;
}
.exploreButton
{
  background-color: #FEC53D;
@@ -795,6 +810,10 @@ code
  padding-bottom: 20px;
  margin-top: 0;
  margin-bottom: 0;
  i
  {
    margin-left: 5px;
  }
}
.inline_block
{
@@ -897,6 +916,57 @@ code
    }
  }
}
.common_nav {
  border-bottom: 1px solid #E6E6E6;
  margin-left: -15px !important;
  padding-left: 15px !important;
  ul
  {
    margin-bottom: -2px;
    li
    {
      border: 1px solid transparent;
      border-top: 3px solid transparent;
      margin-right: 10px;
      background-color: #e6e6e6;
      border-radius: 4px;
      border-bottom-left-radius: 0;
      border-bottom-right-radius: 0;
      margin-bottom: 1px;
      font-weight: bold;
      a
      {
        color: #666666;
        padding: 12px 16px;
      }
      a:hover
      {
        background-color: transparent;
      }
      a:focus
      {
        background-color: transparent;
      }
    }
    .active
    {
      border-radius: 4px;
      background-color: #f8fafb;
      /* color: #333; */
      border: 1px solid #E6E6E6;
      border-bottom: 0;
      border-radius: 4px;
      border-bottom-right-radius: 0;
      border-bottom-left-radius: 0;
      border-top: 3px solid #FFC627;
      a
      {
        color: #333;
        margin-bottom: 1px;
      }
    }
  }
}
.header {
  background: #fff;
  width: 100%;
@@ -23,6 +23,7 @@ class Main extends React.Component {
    this.state = {
      showBannerNotification: false,
      showEvents: false,
      showSchemaStitch: false,
    };

    this.state.loveConsentState = getLoveConsentState();
@@ -37,10 +38,9 @@ class Main extends React.Component {
    dispatch(checkServerUpdates()).then(() => {
      let isUpdateAvailable = false;
      try {
        const showEvents = semverCheck('eventsTab', this.props.serverVersion);
        if (showEvents) {
          this.setState({ showEvents: true });
        }
        this.checkEventsTab().then(() => {
          this.checkSchemaStitch();
        });
        isUpdateAvailable = semver.gt(
          this.props.latestServerVersion,
          this.props.serverVersion
@@ -56,11 +56,27 @@ class Main extends React.Component {
        }
      } catch (e) {
        console.error(e);
        this.setState({ showEvents: true });
      }
    });
  }
  checkSchemaStitch() {
    const showSchemaStitch = semverCheck(
      'schemaStitching',
      this.props.serverVersion
    );
    if (showSchemaStitch) {
      this.setState({ ...this.state, showSchemaStitch: true });
    }
    return Promise.resolve();
  }
  checkEventsTab() {
    const showEvents = semverCheck('eventsTab', this.props.serverVersion);
    if (showEvents) {
      this.setState({ showEvents: true });
    }
    return Promise.resolve();
  }
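Both `checkSchemaStitch` and `checkEventsTab` reduce to the same question: is the connected server at least the version that shipped the feature? A standalone sketch of such a gate (the feature table and `featureAvailable` here are illustrative stand-ins, not the console's actual `semverCheck` implementation):

```javascript
// Illustrative minimum versions per feature; not the console's real table.
const FEATURE_MIN_VERSION = {
  eventsTab: [1, 0, 0],
  schemaStitching: [1, 0, 0],
};

// 'v1.2.0' -> [1, 2, 0]
const parseVersion = v =>
  v.replace(/^v/, '').split('.').map(n => parseInt(n, 10));

// Component-wise semver comparison: a >= b
const gte = (a, b) => {
  for (let i = 0; i < 3; i++) {
    if ((a[i] || 0) !== (b[i] || 0)) return (a[i] || 0) > (b[i] || 0);
  }
  return true;
};

// featureAvailable('eventsTab', 'v1.2.0') -> boolean gate for the nav item
const featureAvailable = (feature, serverVersion) => {
  const min = FEATURE_MIN_VERSION[feature];
  if (!min || !serverVersion) return false;
  return gte(parseVersion(serverVersion), min);
};
```

The console then simply sets `showSchemaStitch` / `showEvents` in component state from the gate's result, which toggles the corresponding sidebar entry.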
  handleBodyClick(e) {
    const heartDropDownOpen = document.querySelectorAll(
      '#dropdown_wrapper.open'
@@ -212,6 +228,32 @@ class Main extends React.Component {
                </Link>
              </li>
            </OverlayTrigger>
            {this.state.showSchemaStitch ? (
              <OverlayTrigger
                placement="right"
                overlay={tooltip.customresolver}
              >
                <li>
                  <Link
                    className={
                      currentActiveBlock === 'remote-schemas'
                        ? styles.navSideBarActive
                        : ''
                    }
                    to={appPrefix + '/remote-schemas'}
                  >
                    <div className={styles.iconCenter}>
                      <i
                        title="Remote Schemas"
                        className="fa fa-plug"
                        aria-hidden="true"
                      />
                    </div>
                    <p>Remote Schemas</p>
                  </Link>
                </li>
              </OverlayTrigger>
            ) : null}
            {this.state.showEvents ? (
              <OverlayTrigger placement="right" overlay={tooltip.events}>
                <li>
@@ -13,6 +13,10 @@ export const events = (
  <Tooltip id="tooltip-events">Manage Event Triggers</Tooltip>
);

export const customresolver = (
  <Tooltip id="tooltip-customresolver">Manage Remote Schemas</Tooltip>
);

export const secureEndpoint = (
  <Tooltip id="tooltip-secure-endpoint">
    This graphql endpoint is public and you should add an access key
53  console/src/components/Services/CustomResolver/Add/Add.js  (new file)
@@ -0,0 +1,53 @@
import React from 'react';
import Common from '../Common/Common';

import { addResolver, RESET } from './addResolverReducer';
import Helmet from 'react-helmet';

import { pageTitle } from '../constants';

class Add extends React.Component {
  componentWillUnmount() {
    this.props.dispatch({ type: RESET });
  }
  render() {
    const styles = require('../Styles.scss');
    const { isRequesting, dispatch } = this.props;
    return (
      <div className={styles.addWrapper}>
        <Helmet title={`Add ${pageTitle} - ${pageTitle}s | Hasura`} />
        <div className={styles.heading_text}>Add a new remote schema</div>
        <form
          onSubmit={e => {
            e.preventDefault();
            dispatch(addResolver());
          }}
        >
          <Common {...this.props} />
          <div className={styles.commonBtn}>
            <button
              type="submit"
              className={styles.yellow_button}
              disabled={isRequesting}
              data-test="add-remote-schema-submit"
            >
              {isRequesting ? 'Adding...' : 'Add Remote Schema'}
            </button>
            {/*
              <button className={styles.default_button}>Cancel</button>
            */}
          </div>
        </form>
      </div>
    );
  }
}

const mapStateToProps = state => {
  return {
    ...state.customResolverData.addData,
    ...state.customResolverData.headerData,
  };
};

export default connect => connect(mapStateToProps)(Add);
@@ -0,0 +1,696 @@
/* defaultState */
import { addState } from '../state';
/* */

import Endpoints, { globalCookiePolicy } from '../../../../Endpoints';
import requestAction from '../../../../utils/requestAction';
import dataHeaders from '../../Data/Common/Headers';
import { push } from 'react-router-redux';
import { fetchResolvers } from '../customActions';

import { generateHeaderSyms } from '../../Layout/ReusableHeader/HeaderReducer';
import { makeRequest } from '../customActions';
// import { UPDATE_MIGRATION_STATUS_ERROR } from '../../../Main/Actions';
import { appPrefix } from '../constants';

import globals from '../../../../Globals';

const prefixUrl = globals.urlPrefix + appPrefix;

/* */
const MANUAL_URL_CHANGED = '@addResolver/MANUAL_URL_CHANGED';
const ENV_URL_CHANGED = '@addResolver/ENV_URL_CHANGED';
const NAME_CHANGED = '@addResolver/NAME_CHANGED';
// const HEADER_CHANGED = '@addResolver/HEADER_CHANGED';
const ADDING_RESOLVER = '@addResolver/ADDING_RESOLVER';
const ADD_RESOLVER_FAIL = '@addResolver/ADD_RESOLVER_FAIL';
const RESET = '@addResolver/RESET';
const FETCHING_INDIV_RESOLVER = '@addResolver/FETCHING_INDIV_RESOLVER';
const RESOLVER_FETCH_SUCCESS = '@addResolver/RESOLVER_FETCH_SUCCESS';
const RESOLVER_FETCH_FAIL = '@addResolver/RESOLVER_FETCH_FAIL';

const DELETING_RESOLVER = '@addResolver/DELETING_RESOLVER';
const DELETE_RESOLVER_FAIL = '@addResolver/DELETE_RESOLVER_FAIL';

const MODIFY_RESOLVER_FAIL = '@addResolver/MODIFY_RESOLVER_FAIL';
const MODIFYING_RESOLVER = '@addResolver/MODIFYING_RESOLVER';

const UPDATE_FORWARD_CLIENT_HEADERS =
  '@addResolver/UPDATE_FORWARD_CLIENT_HEADERS';

/* */
const TOGGLE_MODIFY = '@editResolver/TOGGLE_MODIFY';
/* */
/* */

const inputEventMap = {
  name: NAME_CHANGED,
  envName: ENV_URL_CHANGED,
  manualUrl: MANUAL_URL_CHANGED,
};

/* Action creators */
const inputChange = (type, data) => {
  return dispatch => dispatch({ type: inputEventMap[type], data });
};

const getHeaderEvents = generateHeaderSyms('CUSTOM_RESOLVER');
/* */

const getReqHeader = headers => {
  const headersObj = headers.filter(h => {
    return h.name && h.name.length > 0;
  });
  const requestHeader =
    headersObj.length > 0
      ? headersObj.map(h => {
          const reqHead = {
            name: h.name,
          };
          if (h.type === 'static') {
            reqHead.value = h.value;
          } else {
            reqHead.value_from_env = h.value;
          }
          return reqHead;
        })
      : [];
  return requestHeader;
};
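`getReqHeader` normalizes the console's editable header rows into the metadata API's wire format: rows typed `static` carry `value`, rows typed `env` carry `value_from_env`, and rows with an empty name (the trailing blank form row) are dropped. A self-contained copy of the transform, for illustration:

```javascript
// Standalone copy of the transform: console header rows -> API headers.
// Rows without a name (e.g. the trailing empty form row) are filtered out.
const getReqHeader = headers => {
  const named = headers.filter(h => h.name && h.name.length > 0);
  return named.map(h =>
    h.type === 'static'
      ? { name: h.name, value: h.value }
      : { name: h.name, value_from_env: h.value }
  );
};
```

Behavior matches the original: an input with no named rows simply maps to an empty array.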
const fetchResolver = resolver => {
  return (dispatch, getState) => {
    const url = Endpoints.getSchema;
    const options = {
      credentials: globalCookiePolicy,
      method: 'POST',
      headers: dataHeaders(getState),
      body: JSON.stringify({
        type: 'select',
        args: {
          table: {
            name: 'remote_schemas',
            schema: 'hdb_catalog',
          },
          columns: ['*'],
          where: {
            name: resolver,
          },
        },
      }),
    };
    dispatch({ type: FETCHING_INDIV_RESOLVER });
    return dispatch(requestAction(url, options)).then(
      data => {
        if (data.length > 0) {
          dispatch({ type: RESOLVER_FETCH_SUCCESS, data: data });
          const headerObj = [];
          data[0].definition.headers.forEach(d => {
            headerObj.push({
              name: d.name,
              value: d.value ? d.value : d.value_from_env,
              type: d.value ? 'static' : 'env',
            });
          });
          headerObj.push({
            name: '',
            type: 'static',
            value: '',
          });
          dispatch({
            type: getHeaderEvents.UPDATE_HEADERS,
            data: [...headerObj],
          });
          return Promise.resolve();
        }
        return dispatch(push(`${prefixUrl}`));
      },
      error => {
        console.error('Failed to fetch resolver: ' + JSON.stringify(error));
        return dispatch({ type: RESOLVER_FETCH_FAIL, data: error });
      }
    );
  };
};
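When an existing schema is fetched, `fetchResolver` performs the inverse of `getReqHeader`: wire-format headers are turned back into editable UI rows (`value` present means a `static` row, otherwise an `env` row), and a blank row is appended so the form always offers an empty input. That mapping, extracted as a pure function for illustration (the name `toUiHeaderRows` is ours):

```javascript
// Inverse of the request mapping: API headers back into editable UI rows,
// with a trailing blank row appended so the form always has an empty input.
const toUiHeaderRows = apiHeaders => {
  const rows = apiHeaders.map(d => ({
    name: d.name,
    value: d.value ? d.value : d.value_from_env,
    type: d.value ? 'static' : 'env',
  }));
  rows.push({ name: '', type: 'static', value: '' });
  return rows;
};
```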
const addResolver = () => {
  return (dispatch, getState) => {
    const currState = getState().customResolverData.addData;
    // const url = Endpoints.getSchema;
    const resolveObj = {
      name: currState.name.trim().replace(/ +/g, ''),
      definition: {
        url: currState.manualUrl,
        url_from_env: currState.envName,
        headers: [],
        forward_client_headers: currState.forwardClientHeaders,
      },
    };

    resolveObj.definition.headers = [
      ...getReqHeader(getState().customResolverData.headerData.headers),
    ];
    if (resolveObj.definition.url) {
      delete resolveObj.definition.url_from_env;
    } else {
      delete resolveObj.definition.url;
    }
    /* TODO: Add mandatory fields validation */

    const migrationName =
      'create_remote_schema_' + currState.name.trim().replace(/ +/g, '');

    const payload = {
      type: 'add_remote_schema',
      args: {
        ...resolveObj,
      },
    };

    const downPayload = {
      type: 'remove_remote_schema',
      args: {
        name: currState.name,
      },
    };

    const upQueryArgs = [];
    upQueryArgs.push(payload);
    const downQueryArgs = [];
    downQueryArgs.push(downPayload);
    const upQuery = {
      type: 'bulk',
      args: upQueryArgs,
    };

    const downQuery = {
      type: 'bulk',
      args: downQueryArgs,
    };

    const requestMsg = 'Adding remote schema...';
    const successMsg = 'Remote schema added successfully';
    const errorMsg = 'Adding remote schema failed';

    const customOnSuccess = data => {
      Promise.all([
        dispatch({ type: RESET }),
        dispatch(push(`${prefixUrl}/manage/${resolveObj.name}/details`)),
        dispatch(fetchResolvers()),
        dispatch({ type: getHeaderEvents.RESET_HEADER, data: data }),
      ]);
    };
    const customOnError = err => {
      console.error('Failed to create remote schema: ' + JSON.stringify(err));
      dispatch({ type: ADD_RESOLVER_FAIL, data: err });
      // dispatch({ type: UPDATE_MIGRATION_STATUS_ERROR, data: err });
      // alert(JSON.stringify(err));
    };
    dispatch({ type: ADDING_RESOLVER });
    return dispatch(
      makeRequest(
        upQuery.args,
        downQuery.args,
        migrationName,
        customOnSuccess,
        customOnError,
        requestMsg,
        successMsg,
        errorMsg
      )
    );
  };
};
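Every action in this file follows the same migration pattern: a `bulk` up query that applies the change and a `bulk` down query that reverts it, both handed to `makeRequest` under a migration name. A sketch of the pair `addResolver` builds (the builder function is ours, for illustration):

```javascript
// Sketch: the reversible up/down bulk pair for creating a remote schema.
// The down query undoes the up query, so the migration can be rolled back.
const buildAddRemoteSchemaMigration = (name, definition) => ({
  upQuery: {
    type: 'bulk',
    args: [{ type: 'add_remote_schema', args: { name, definition } }],
  },
  downQuery: {
    type: 'bulk',
    args: [{ type: 'remove_remote_schema', args: { name } }],
  },
});
```

`deleteResolver` below is the mirror image (remove up, add-back down), and `modifyResolver` composes both: its up query removes the old schema and adds the edited one, while its down query removes the edited one and restores the original.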
const deleteResolver = () => {
  return (dispatch, getState) => {
    const currState = getState().customResolverData.addData;
    // const url = Endpoints.getSchema;
    const resolveObj = {
      name: currState.editState.originalName,
    };
    const migrationName =
      'remove_remote_schema_' + resolveObj.name.trim().replace(/ +/g, '');
    const payload = {
      type: 'remove_remote_schema',
      args: {
        name: currState.editState.originalName,
      },
    };
    const downPayload = {
      type: 'add_remote_schema',
      args: {
        name: currState.editState.originalName,
        definition: {
          url: currState.editState.originalUrl,
          url_from_env: currState.editState.originalEnvUrl,
          headers: [],
          forward_client_headers:
            currState.editState.originalForwardClientHeaders,
        },
      },
    };

    downPayload.args.definition.headers = [
      ...currState.editState.originalHeaders,
    ];

    const upQueryArgs = [];
    upQueryArgs.push(payload);
    const downQueryArgs = [];
    downQueryArgs.push(downPayload);
    const upQuery = {
      type: 'bulk',
      args: upQueryArgs,
    };
    const downQuery = {
      type: 'bulk',
      args: downQueryArgs,
    };
    const requestMsg = 'Deleting remote schema...';
    const successMsg = 'Remote schema deleted successfully';
    const errorMsg = 'Delete remote schema failed';

    const customOnSuccess = () => {
      // dispatch({ type: REQUEST_SUCCESS });
      Promise.all([
        dispatch({ type: RESET }),
        dispatch(push(prefixUrl)),
        dispatch(fetchResolvers()),
      ]);
    };
    const customOnError = error => {
      Promise.all([dispatch({ type: DELETE_RESOLVER_FAIL, data: error })]);
    };

    dispatch({ type: DELETING_RESOLVER });
    return dispatch(
      makeRequest(
        upQuery.args,
        downQuery.args,
        migrationName,
        customOnSuccess,
        customOnError,
        requestMsg,
        successMsg,
        errorMsg
      )
    );
  };
};
const modifyResolver = () => {
  return (dispatch, getState) => {
    const currState = getState().customResolverData.addData;
    // const url = Endpoints.getSchema;
    const upQueryArgs = [];
    const downQueryArgs = [];
    const migrationName =
      'update_remote_schema_' + currState.name.trim().replace(/ +/g, '');
    const schemaName = currState.name.trim().replace(/ +/g, '');
    const deleteResolverUp = {
      type: 'remove_remote_schema',
      args: {
        name: currState.editState.originalName,
      },
    };
    const trimmedName = currState.name.trim().replace(/ +/g, '');
    const resolveObj = {
      name: trimmedName,
      definition: {
        url: currState.manualUrl,
        url_from_env: currState.envName,
        forward_client_headers: currState.forwardClientHeaders,
        headers: [],
      },
    };

    resolveObj.definition.headers = [
      ...getReqHeader(getState().customResolverData.headerData.headers),
    ];
    if (resolveObj.definition.url) {
      delete resolveObj.definition.url_from_env;
    } else {
      delete resolveObj.definition.url;
    }

    const createResolverUp = {
      type: 'add_remote_schema',
      args: {
        ...resolveObj,
      },
    };
    upQueryArgs.push(deleteResolverUp);
    upQueryArgs.push(createResolverUp);

    // Delete the new one and create the old one
    const deleteResolverDown = {
      type: 'remove_remote_schema',
      args: {
        name: trimmedName,
      },
    };
    const resolveDownObj = {
      name: currState.editState.originalName,
      definition: {
        url: currState.editState.originalUrl,
        url_from_env: currState.editState.originalEnvUrl,
        headers: [],
        forward_client_headers:
          currState.editState.originalForwardClientHeaders,
      },
    };

    resolveDownObj.definition.headers = [
      ...currState.editState.originalHeaders,
    ];
    if (resolveDownObj.definition.url) {
      delete resolveDownObj.definition.url_from_env;
    } else {
      delete resolveDownObj.definition.url;
    }

    const createResolverDown = {
      type: 'add_remote_schema',
      args: {
        ...resolveDownObj,
      },
    };
    downQueryArgs.push(deleteResolverDown);
    downQueryArgs.push(createResolverDown);
    // End of down

    const upQuery = {
      type: 'bulk',
      args: upQueryArgs,
    };
    const downQuery = {
      type: 'bulk',
      args: downQueryArgs,
    };
    const requestMsg = 'Modifying remote schema...';
    const successMsg = 'Remote schema modified';
    const errorMsg = 'Modify remote schema failed';

    const customOnSuccess = data => {
      // dispatch({ type: REQUEST_SUCCESS });
      Promise.all([
        dispatch({ type: RESET, data: data }),
        dispatch(fetchResolvers()),
      ]).then(() => {
        return Promise.all([
          dispatch(fetchResolver(schemaName)),
          dispatch(push(`${prefixUrl}/manage/${trimmedName}/details`)),
        ]);
      });
    };
    const customOnError = error => {
      Promise.all([dispatch({ type: MODIFY_RESOLVER_FAIL, data: error })]);
    };

    dispatch({ type: MODIFYING_RESOLVER });
    return dispatch(
      makeRequest(
        upQuery.args,
        downQuery.args,
        migrationName,
        customOnSuccess,
        customOnError,
        requestMsg,
        successMsg,
        errorMsg
      )
    );
  };
};
/*
const modifyResolver = () => {
  return (dispatch, getState) => {
    const currState = getState().customResolverData.addData;
    // const url = Endpoints.getSchema;
    let upQueryArgs = [];
    let downQueryArgs = [];
    const migrationName = 'update_add_schema_' + currState.name.trim();
    const schemaName = currState.name.trim();
    const deleteResolverUp = {
      type: 'remove_remote_schema',
      args: {
        name: currState.editState.originalName,
      },
    };

    upQueryArgs.push(deleteResolverUp);

    // Delete the new one and create the old one
    const resolveDownObj = {
      name: currState.editState.originalName,
      url: currState.editState.originalUrl,
      url_from_env: currState.editState.originalEnvUrl,
      headers: [],
    };

    resolveDownObj.headers = [...currState.editState.originalHeaders];
    if (resolveDownObj.url) {
      delete resolveDownObj.url_from_env;
    } else {
      delete resolveDownObj.url;
    }

    const createResolverDown = {
      type: 'add_remote_schema',
      args: {
        ...resolveDownObj,
      },
    };
    downQueryArgs.push(createResolverDown);

    let upQuery = {
      type: 'bulk',
      args: upQueryArgs,
    };
    let downQuery = {
      type: 'bulk',
      args: downQueryArgs,
    };
    const requestMsg = 'Modifying schema...';
    const successMsg = 'Schema modified';
    const errorMsg = 'Modify schema failed';

    const customOnSuccess = () => {
      // dispatch({ type: REQUEST_SUCCESS });
      // Do the modify thing here
      upQueryArgs = [];
      downQueryArgs = [];
      const resolveObj = {
        name: currState.name.trim(),
        url: currState.manualUrl,
        url_from_env: currState.envName,
        headers: [],
      };

      resolveObj.headers = [
        ...getReqHeader(getState().customResolverData.headerData.headers),
      ];
      if (resolveObj.url) {
        delete resolveObj.url_from_env;
      } else {
        delete resolveObj.url;
      }

      const createResolverUp = {
        type: 'add_remote_schema',
        args: {
          ...resolveObj,
        },
      };
      upQueryArgs.push(createResolverUp);

      const deleteResolverDown = {
        type: 'remove_remote_schema',
        args: {
          name: currState.name,
        },
      };
      downQueryArgs.push(deleteResolverDown);

      upQuery = {
        type: 'bulk',
        args: upQueryArgs,
      };
      downQuery = {
        type: 'bulk',
        args: downQueryArgs,
      };

      const tOnSuccess = () => {
        Promise.all([
          dispatch({ type: RESET }),
          dispatch(fetchResolvers()),
        ]).then(() => {
          return dispatch(fetchResolver(schemaName));
        });
      };
      const tOnError = error => {
        Promise.all([dispatch({ type: MODIFY_RESOLVER_FAIL, data: error })]);
      };

      return dispatch(
        makeRequest(
          upQuery.args,
          downQuery.args,
          migrationName,
          tOnSuccess,
          tOnError,
          requestMsg,
          successMsg,
          errorMsg
        )
      );
    };
    const customOnError = error => {
      Promise.all([dispatch({ type: MODIFY_RESOLVER_FAIL, data: error })]);
    };

    dispatch({ type: MODIFYING_RESOLVER });
    return dispatch(
      makeRequest(
        upQuery.args,
        downQuery.args,
        migrationName,
        customOnSuccess,
        customOnError,
        requestMsg
      )
    );
  };
};
*/
const addResolverReducer = (state = addState, action) => {
  switch (action.type) {
    case MANUAL_URL_CHANGED:
      return {
        ...state,
        manualUrl: action.data,
        envName: null,
      };
    case NAME_CHANGED:
      return {
        ...state,
        name: action.data,
      };
    case ENV_URL_CHANGED:
      return {
        ...state,
        envName: action.data,
        manualUrl: null,
      };
    case ADDING_RESOLVER:
      return {
        ...state,
        isRequesting: true,
        isError: null,
      };
    case ADD_RESOLVER_FAIL:
      return {
        ...state,
        isRequesting: false,
        isError: action.data,
      };
    case TOGGLE_MODIFY:
      return {
        ...state,
        headers: [...state.editState.headers],
        editState: {
          ...state.editState,
          isModify: !state.editState.isModify,
        },
      };

    case RESET:
      return {
        ...addState,
      };
    case FETCHING_INDIV_RESOLVER:
      return {
        ...state,
        isFetching: true,
        isFetchError: null,
      };
    case RESOLVER_FETCH_SUCCESS:
      return {
        ...state,
        name: action.data[0].name,
        manualUrl: action.data[0].definition.url || null,
        envName: action.data[0].definition.url_from_env || null,
        headers: action.data[0].definition.headers || [],
        forwardClientHeaders: action.data[0].definition.forward_client_headers,
        editState: {
          ...state,
          id: action.data[0].id,
          isModify: false,
          originalName: action.data[0].name,
          originalHeaders: action.data[0].definition.headers || [],
          originalUrl: action.data[0].definition.url || null,
          originalEnvUrl: action.data[0].definition.url_from_env || null,
          originalForwardClientHeaders:
            action.data[0].definition.forward_client_headers || false,
        },
        isFetching: false,
        isFetchError: null,
      };
    case RESOLVER_FETCH_FAIL:
      return {
        ...state,
        isFetching: false,
        isFetchError: action.data,
      };
    case DELETE_RESOLVER_FAIL:
      return {
        ...state,
        isRequesting: false,
        isError: action.data,
      };
    case DELETING_RESOLVER:
      return {
        ...state,
        isRequesting: true,
        isError: null,
      };
    case MODIFY_RESOLVER_FAIL:
      return {
        ...state,
        isRequesting: false,
        isError: action.data,
      };
    case MODIFYING_RESOLVER:
      return {
        ...state,
        isRequesting: true,
        isError: null,
      };
    case UPDATE_FORWARD_CLIENT_HEADERS:
      return {
        ...state,
        forwardClientHeaders: !state.forwardClientHeaders,
      };
    default:
      return {
        ...state,
      };
  }
};
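One detail worth noting in the reducer: `MANUAL_URL_CHANGED` nulls out `envName` and `ENV_URL_CHANGED` nulls out `manualUrl`, so at most one URL source is ever set, which is what the `DropdownButton` in `Common` relies on to decide its title and placeholder. A reduced sketch of just that slice:

```javascript
// Minimal slice of the reducer showing the URL-source exclusivity:
// setting one URL source always clears the other.
const MANUAL_URL_CHANGED = '@addResolver/MANUAL_URL_CHANGED';
const ENV_URL_CHANGED = '@addResolver/ENV_URL_CHANGED';

const initial = { manualUrl: null, envName: null };

const urlReducer = (state = initial, action) => {
  switch (action.type) {
    case MANUAL_URL_CHANGED:
      return { ...state, manualUrl: action.data, envName: null };
    case ENV_URL_CHANGED:
      return { ...state, envName: action.data, manualUrl: null };
    default:
      return state;
  }
};
```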
export {
  inputChange,
  addResolver,
  fetchResolver,
  deleteResolver,
  modifyResolver,
  RESET,
  TOGGLE_MODIFY,
  UPDATE_FORWARD_CLIENT_HEADERS,
};

export default addResolverReducer;
176  console/src/components/Services/CustomResolver/Common/Common.js  (new file)
@@ -0,0 +1,176 @@
import React from 'react';
import PropTypes from 'prop-types';
import OverlayTrigger from 'react-bootstrap/lib/OverlayTrigger';
import Tooltip from 'react-bootstrap/lib/Tooltip';
import DropdownButton from './DropdownButton';

import {
  inputChange,
  UPDATE_FORWARD_CLIENT_HEADERS,
} from '../Add/addResolverReducer';

import CommonHeader from '../../Layout/ReusableHeader/Header';

const graphqlurl = (
  <Tooltip id="tooltip-cascade">
    Remote GraphQL server’s URL. E.g. https://my-domain/v1alpha1/graphql
  </Tooltip>
);
const clientHeaderForward = (
  <Tooltip id="tooltip-cascade">
    Toggle forwarding headers sent by the client app in the request to your
    remote GraphQL server
  </Tooltip>
);
const additionalHeaders = (
  <Tooltip id="tooltip-cascade">
    Custom headers to be sent to the remote GraphQL server. E.g. an access key
  </Tooltip>
);
const schema = (
  <Tooltip id="tooltip-cascade">
    Give this GraphQL schema a friendly name.
  </Tooltip>
);

class Common extends React.Component {
  getPlaceHolderText(valType) {
    if (valType === 'static') {
      return 'header value';
    }
    return 'env var name';
  }
  handleInputChange(e) {
    const fieldName = e.target.getAttribute('data-key');
    this.props.dispatch(inputChange(fieldName, e.target.value));
  }
  toggleUrlParam(e) {
    const field = e.target.getAttribute('value');
    this.props.dispatch(inputChange(field, ''));
  }
  toggleForwardHeaders() {
    this.props.dispatch({ type: UPDATE_FORWARD_CLIENT_HEADERS });
  }
  render() {
    const styles = require('../Styles.scss');
    const { name, manualUrl, envName, forwardClientHeaders } = this.props;
    const { isModify, id } = this.props.editState;
    const isDisabled = id >= 0 && !isModify;
    const urlRequired = !manualUrl && !envName;
    return (
      <div className={styles.CommonWrapper}>
        <div className={styles.subheading_text + ' ' + styles.addPaddTop}>
          Remote Schema name *
          <OverlayTrigger placement="right" overlay={schema}>
            <i className="fa fa-question-circle" aria-hidden="true" />
          </OverlayTrigger>
        </div>
        <label
          className={
            styles.inputLabel + ' radio-inline ' + styles.padd_left_remove
          }
        >
          <input
            className={'form-control'}
            type="text"
            placeholder="Name of the schema"
            value={name}
            data-key="name"
            onChange={this.handleInputChange.bind(this)}
            disabled={isDisabled}
            required
            data-test="remote-schema-schema-name"
            pattern="^[a-zA-Z0-9-_]*$"
            title="Special characters except '-' or '_' are not allowed"
          />
        </label>
        <hr />
        <div className={styles.subheading_text}>
          GraphQL server URL *
          <OverlayTrigger placement="right" overlay={graphqlurl}>
            <i className="fa fa-question-circle" aria-hidden="true" />
          </OverlayTrigger>
        </div>
        <div className={styles.addPaddCommom + ' ' + styles.wd_300}>
          <DropdownButton
            dropdownOptions={[
              { display_text: 'URL', value: 'manualUrl' },
              { display_text: 'From env var', value: 'envName' },
            ]}
            title={
              (manualUrl !== null && 'URL') ||
              (envName !== null && 'From env var') ||
              'Value'
            }
            dataKey={
              (manualUrl !== null && 'manualUrl') ||
              (envName !== null && 'envName')
            }
            onButtonChange={this.toggleUrlParam.bind(this)}
            onInputChange={this.handleInputChange.bind(this)}
            required={urlRequired}
            bsClass={styles.dropdown_button}
            inputVal={manualUrl || envName}
            disabled={isDisabled}
            id="graphql-server-url"
            inputPlaceHolder={
              (manualUrl !== null &&
                'https://my-graphql-service.com/graphql') ||
              (envName !== null && 'MY_GRAPHQL_ENDPOINT')
            }
            testId="remote-schema-graphql-url"
          />
        </div>
        <div className={styles.subheading_text + ' ' + styles.addPaddTop}>
          Headers for the remote GraphQL server
        </div>
        <div className={styles.check_box}>
          <label>
            <input
              onChange={this.toggleForwardHeaders.bind(this)}
              className={styles.display_inline + ' ' + styles.add_mar_right}
              type="checkbox"
              value="forwardHeaders"
              data-test="forward-remote-schema-headers"
              checked={forwardClientHeaders}
              disabled={isDisabled}
            />
            <span>Forward all headers from client</span>
          </label>
          <OverlayTrigger placement="right" overlay={clientHeaderForward}>
            <i className="fa fa-question-circle" aria-hidden="true" />
          </OverlayTrigger>
        </div>
        <div className={styles.subheading_text + ' ' + styles.font_normal}>
          Additional headers:
          <OverlayTrigger placement="right" overlay={additionalHeaders}>
            <i className="fa fa-question-circle" aria-hidden="true" />
          </OverlayTrigger>
        </div>
        <CommonHeader
          eventPrefix="CUSTOM_RESOLVER"
          headers={this.props.headers}
          dispatch={this.props.dispatch}
          typeOptions={[
            { display_text: 'Value', value: 'static' },
            { display_text: 'From env var', value: 'env' },
          ]}
          isDisabled={isDisabled}
          placeHolderText={this.getPlaceHolderText.bind(this)}
          keyInputPlaceholder="header name"
        />
      </div>
    );
  }
}

Common.propTypes = {
  name: PropTypes.string.isRequired,
  envName: PropTypes.string.isRequired,
  manualUrl: PropTypes.string.isRequired,
  headers: PropTypes.array.isRequired,
  forwardClientHeaders: PropTypes.bool.isRequired,
  dispatch: PropTypes.func.isRequired,
};

export default Common;
@ -0,0 +1,83 @@
import React from 'react';
import PropTypes from 'prop-types';

import InputGroup from 'react-bootstrap/lib/InputGroup';
import DropdownButton from 'react-bootstrap/lib/DropdownButton';
import MenuItem from 'react-bootstrap/lib/MenuItem';

class DropButton extends React.Component {
  render() {
    const {
      title,
      dropdownOptions,
      value,
      required,
      onInputChange,
      onButtonChange,
      dataKey,
      dataIndex,
      bsClass,
      disabled,
      inputVal,
      inputPlaceHolder,
      id,
      testId,
    } = this.props;
    return (
      <InputGroup className={bsClass}>
        <DropdownButton
          title={value || title}
          componentClass={InputGroup.Button}
          disabled={disabled}
          id={id}
          data-test={testId + '-' + 'dropdown-button'}
        >
          {dropdownOptions.map((d, i) => (
            <MenuItem
              data-index-id={dataIndex}
              value={d.value}
              onClick={onButtonChange}
              eventKey={i + 1}
              key={i}
              data-test={testId + '-' + 'dropdown-item' + '-' + (i + 1)}
            >
              {d.display_text}
            </MenuItem>
          ))}
        </DropdownButton>
        <input
          type="text"
          data-key={dataKey}
          data-index-id={dataIndex}
          className={'form-control'}
          required={required}
          onChange={onInputChange}
          disabled={disabled}
          value={inputVal || ''}
          placeholder={inputPlaceHolder}
          data-test={testId + '-' + 'input'}
        />
      </InputGroup>
    );
  }
}

DropButton.propTypes = {
  dispatch: PropTypes.func.isRequired,
  dropdownOptions: PropTypes.array.isRequired,
  title: PropTypes.string.isRequired,
  value: PropTypes.string.isRequired,
  dataKey: PropTypes.string.isRequired,
  dataIndex: PropTypes.string.isRequired,
  inputVal: PropTypes.string.isRequired,
  inputPlaceHolder: PropTypes.string,
  required: PropTypes.bool.isRequired,
  onButtonChange: PropTypes.func.isRequired,
  onInputChange: PropTypes.func.isRequired,
  bsClass: PropTypes.string,
  id: PropTypes.string,
  testId: PropTypes.string,
  disabled: PropTypes.bool.isRequired,
};

export default DropButton;
@ -0,0 +1,163 @@
import React from 'react';

import { Route, IndexRedirect, Link } from 'react-router';
import { layoutConnector, rightBar } from '../Layout';
import globals from '../../../Globals';
import {
  landingCustomResolverGen,
  addConnector,
  editConnector,
  viewConnector,
} from '.';
import { fetchResolvers, FILTER_RESOLVER } from './customActions';

// The objective is to render the list of custom resolvers in the
// left nav bar.
// The custom resolvers list is fetched from hdb_catalog/remote_schemas.
// Whenever an operation such as adding or deleting a resolver happens, this state should update automatically.

import { appPrefix } from './constants';

const listItem = (dataList, styles, currentLocation, currentResolver) => {
  if (dataList.length === 0) {
    return (
      <li
        className={styles.noTables}
        data-test="remote-schema-sidebar-no-schemas"
      >
        <i>No remote schemas available</i>
      </li>
    );
  }
  return dataList.map((d, i) => {
    let activeTableClass = '';
    if (
      d.name === currentResolver &&
      currentLocation.pathname.indexOf(currentResolver) !== -1
    ) {
      activeTableClass = styles.activeTable;
    }
    return (
      <li
        className={activeTableClass}
        key={i}
        data-test={`remote-schema-sidebar-links-${i + 1}`}
      >
        <Link
          to={appPrefix + '/manage/' + d.name + '/details'}
          data-test={d.name}
        >
          <i className={styles.tableIcon + ' fa fa-table'} aria-hidden="true" />
          {d.name}
        </Link>
      </li>
    );
  });
};

const filterItem = dispatch => {
  return (dataList, searchVal) => {
    // form new schema
    const matchedTables = dataList.filter(data => {
      return (
        data.name
          .toLowerCase()
          .indexOf(searchVal ? searchVal.toLowerCase() : '') !== -1
      );
    });
    dispatch({
      type: FILTER_RESOLVER,
      data: {
        filtered: matchedTables,
        searchQuery: searchVal,
      },
    });
  };
};

const leftNavMapStateToProps = state => {
  return {
    ...state,
    dataList: [...state.customResolverData.listData.resolvers],
    isError: state.customResolverData.listData.isError,
    isRequesting: state.customResolverData.listData.isRequesting,
    filtered: [...state.customResolverData.listData.filtered],
    searchQuery: state.customResolverData.listData.searchQuery,
    viewResolver: state.customResolverData.listData.viewResolver,
    migrationMode: state.main.migrationMode ? state.main.migrationMode : false,
    listItemTemplate: listItem,
    appPrefix,
  };
};

const leftNavMapDispatchToProps = dispatch => {
  return {
    filterItem: filterItem(dispatch),
  };
};

const fetchInitialData = ({ dispatch }) => {
  return (nextState, replaceState, cb) => {
    /*
    const currState = getState();
    const dataList = currState.customResolverData.listData.resolvers;
    if (dataList.length) {
      cb();
      return;
    }
    */

    Promise.all([dispatch(fetchResolvers())]).then(
      () => {
        cb();
      },
      () => {
        // alert('Could not load schema.');
        replaceState(globals.urlPrefix);
        cb();
      }
    );
  };
};

const getCustomResolverRouter = (connect, store, composeOnEnterHooks) => {
  const migrationRedirects = (nextState, replaceState, cb) => {
    const state = store.getState();
    if (!state.main.migrationMode) {
      replaceState(globals.urlPrefix + appPrefix + '/manage');
    }
    cb();
  };
  return (
    <Route
      path="remote-schemas"
      component={layoutConnector(
        connect,
        leftNavMapStateToProps,
        leftNavMapDispatchToProps
      )}
      onEnter={composeOnEnterHooks([fetchInitialData(store)])}
      onChange={fetchInitialData(store)}
    >
      <IndexRedirect to="manage" />
      <Route path="manage" component={rightBar(connect)}>
        <IndexRedirect to="schemas" />
        <Route path="schemas" component={landingCustomResolverGen(connect)} />
        <Route
          path="add"
          component={addConnector(connect)}
          onEnter={composeOnEnterHooks([migrationRedirects])}
        />
        <Route
          path=":resolverName/details"
          component={viewConnector(connect)}
        />
        <Route path=":resolverName/modify" component={editConnector(connect)} />
      </Route>
    </Route>
  );
};

export default getCustomResolverRouter;
export { appPrefix };
223
console/src/components/Services/CustomResolver/Edit/Edit.js
Normal file
@ -0,0 +1,223 @@
import React from 'react';
import Common from '../Common/Common';

import {
  fetchResolver,
  deleteResolver,
  modifyResolver,
  RESET,
  TOGGLE_MODIFY,
} from '../Add/addResolverReducer';
import { VIEW_RESOLVER } from '../customActions';
import { push } from 'react-router-redux';
import Helmet from 'react-helmet';
import tabInfo from './tabInfo';
import CommonTabLayout from '../../Layout/CommonTabLayout/CommonTabLayout';

import { appPrefix, pageTitle } from '../constants';

import globals from '../../../../Globals';

const prefixUrl = globals.urlPrefix + appPrefix;

class Edit extends React.Component {
  constructor() {
    super();
    this.editClicked = this.editClicked.bind(this);
    this.modifyClick = this.modifyClick.bind(this);
    this.handleDeleteResolver = this.handleDeleteResolver.bind(this);
    this.handleCancelModify = this.handleCancelModify.bind(this);

    this.state = {};
    this.state.deleteConfirmationError = null;
  }
  componentDidMount() {
    const { resolverName } = this.props.params;
    if (!resolverName) {
      this.props.dispatch(push(prefixUrl));
    }
    Promise.all([
      this.props.dispatch(fetchResolver(resolverName)),
      this.props.dispatch({ type: VIEW_RESOLVER, data: resolverName }),
    ]);
  }
  componentWillReceiveProps(nextProps) {
    if (nextProps.params.resolverName !== this.props.params.resolverName) {
      Promise.all([
        this.props.dispatch(fetchResolver(nextProps.params.resolverName)),
        this.props.dispatch({
          type: VIEW_RESOLVER,
          data: nextProps.params.resolverName,
        }),
      ]);
    }
  }
  componentWillUnmount() {
    Promise.all([
      this.props.dispatch({ type: RESET }),
      this.props.dispatch({ type: VIEW_RESOLVER, data: '' }),
    ]);
  }

  handleDeleteResolver(e) {
    e.preventDefault();
    const a = prompt(
      'Are you absolutely sure?\nThis action cannot be undone. This will permanently delete the stitched GraphQL schema. Please type "DELETE" (in caps, without quotes) to confirm.\n'
    );
    try {
      if (a && typeof a === 'string' && a.trim() === 'DELETE') {
        this.updateDeleteConfirmationError(null);
        this.props.dispatch(deleteResolver());
      } else {
        // The input didn't match.
        // Show an error message right next to the button.
        this.updateDeleteConfirmationError('user confirmation error!');
      }
    } catch (err) {
      console.error(err);
    }
  }
  updateDeleteConfirmationError(data) {
    this.setState({ ...this.state, deleteConfirmationError: data });
  }
  modifyClick() {
    this.props.dispatch({ type: TOGGLE_MODIFY });
  }
  handleCancelModify() {
    this.props.dispatch({ type: TOGGLE_MODIFY });
  }
  editClicked() {
    this.props.dispatch(modifyResolver());
  }
  render() {
    const styles = require('../Styles.scss');
    const { isFetching, isRequesting, editState, migrationMode } = this.props;
    const { resolverName } = this.props.params;

    const generateMigrateBtns = () => {
      return 'isModify' in editState && !editState.isModify ? (
        <div className={styles.commonBtn}>
          <button
            className={styles.yellow_button}
            onClick={e => {
              e.preventDefault();
              this.modifyClick();
            }}
            data-test={'remote-schema-edit-modify-btn'}
            disabled={isRequesting}
          >
            Modify
          </button>
          <button
            className={styles.danger_button + ' btn-danger'}
            onClick={e => {
              e.preventDefault();
              this.handleDeleteResolver(e);
            }}
            disabled={isRequesting}
            data-test={'remote-schema-edit-delete-btn'}
          >
            {isRequesting ? 'Deleting ...' : 'Delete'}
          </button>
          {this.state.deleteConfirmationError ? (
            <span
              className={styles.delete_confirmation_error}
              data-test="delete-confirmation-error"
            >
              * {this.state.deleteConfirmationError}
            </span>
          ) : null}
        </div>
      ) : (
        <div className={styles.commonBtn}>
          <button
            className={styles.yellow_button}
            type="submit"
            disabled={isRequesting}
            data-test={'remote-schema-edit-save-btn'}
          >
            {isRequesting ? 'Saving' : 'Save'}
          </button>
          <button
            className={styles.default_button}
            onClick={e => {
              e.preventDefault();
              this.handleCancelModify();
            }}
            data-test={'remote-schema-edit-cancel-btn'}
            disabled={isRequesting}
          >
            Cancel
          </button>
        </div>
      );
    };

    const breadCrumbs = [
      {
        title: 'Remote schemas',
        url: appPrefix,
      },
      {
        title: 'Manage',
        url: appPrefix + '/' + 'manage',
      },
    ];

    if (resolverName) {
      breadCrumbs.push({
        title: resolverName.trim(),
        url:
          appPrefix +
          '/' +
          'manage' +
          '/' +
          resolverName.trim() +
          '/' +
          'details',
      });
      breadCrumbs.push({
        title: 'modify',
        url: '',
      });
    }

    return (
      <div className={styles.addWrapper}>
        <Helmet
          title={`Edit ${pageTitle} - ${resolverName} - ${pageTitle}s | Hasura`}
        />
        <CommonTabLayout
          appPrefix={appPrefix}
          currentTab="modify"
          heading={resolverName}
          tabsInfo={tabInfo}
          breadCrumbs={breadCrumbs}
          baseUrl={`${appPrefix}/manage/${resolverName}`}
          showLoader={isFetching}
        />
        {isFetching ? null : (
          <form
            onSubmit={e => {
              e.preventDefault();
              this.editClicked();
            }}
          >
            <Common {...this.props} />
            {migrationMode ? generateMigrateBtns() : null}
          </form>
        )}
      </div>
    );
  }
}
const mapStateToProps = state => {
  return {
    ...state.customResolverData.addData,
    ...state.customResolverData.headerData,
    migrationMode: state.main.migrationMode,
    dataHeaders: { ...state.tables.dataHeaders },
  };
};

export default connect => connect(mapStateToProps)(Edit);
169
console/src/components/Services/CustomResolver/Edit/View.js
Normal file
@ -0,0 +1,169 @@
import React from 'react';

import CommonTabLayout from '../../Layout/CommonTabLayout/CommonTabLayout';
import tabInfo from './tabInfo';
import Tooltip from 'react-bootstrap/lib/Tooltip';
import OverlayTrigger from 'react-bootstrap/lib/OverlayTrigger';
import { push } from 'react-router-redux';

import { fetchResolver, RESET } from '../Add/addResolverReducer';

import { VIEW_RESOLVER } from '../customActions';
import ReloadMetadata from '../../Data/Metadata/ReloadMetadata';

import { appPrefix } from '../constants';

import globals from '../../../../Globals';

const prefixUrl = globals.urlPrefix + appPrefix;

const refresh = (
  <Tooltip id="tooltip-cascade">
    If your remote schema has changed, you need to refresh the GraphQL Engine
    metadata to query the modified schema
  </Tooltip>
);

class ViewStitchedSchema extends React.Component {
  componentDidMount() {
    const { resolverName } = this.props.params;
    if (!resolverName) {
      this.props.dispatch(push(prefixUrl));
    }
    Promise.all([
      this.props.dispatch(fetchResolver(resolverName)),
      this.props.dispatch({ type: VIEW_RESOLVER, data: resolverName }),
    ]);
  }
  componentWillReceiveProps(nextProps) {
    if (nextProps.params.resolverName !== this.props.params.resolverName) {
      Promise.all([
        this.props.dispatch(fetchResolver(nextProps.params.resolverName)),
        this.props.dispatch({
          type: VIEW_RESOLVER,
          data: nextProps.params.resolverName,
        }),
      ]);
    }
  }
  componentWillUnmount() {
    Promise.all([
      this.props.dispatch({ type: RESET }),
      this.props.dispatch({ type: VIEW_RESOLVER, data: '' }),
    ]);
  }
  render() {
    const styles = require('../Styles.scss');
    const { resolverName } = this.props.params;
    const { manualUrl, envName, headers } = this.props;
    const filterHeaders = headers.filter(h => !!h.name);
    const breadCrumbs = [
      {
        title: 'Remote schemas',
        url: appPrefix,
      },
      {
        title: 'Manage',
        url: appPrefix + '/' + 'manage',
      },
    ];

    if (resolverName) {
      breadCrumbs.push({
        title: resolverName.trim(),
        url:
          appPrefix +
          '/' +
          'manage' +
          '/' +
          resolverName.trim() +
          '/' +
          'details',
      });
      breadCrumbs.push({
        title: 'details',
        url: '',
      });
    }
    return (
      <div
        className={styles.view_stitch_schema_wrapper + ' ' + styles.addWrapper}
      >
        <CommonTabLayout
          appPrefix={appPrefix}
          currentTab="details"
          heading={resolverName}
          tabsInfo={tabInfo}
          breadCrumbs={breadCrumbs}
          baseUrl={`${appPrefix}/manage/${resolverName}`}
        />
        <br />
        <div>
          <div className={styles.detailsSection}>
            <table className="table table-striped table-bordered">
              <thead />
              <tbody>
                <tr>
                  <td>GraphQL Server URL</td>
                  <td>{manualUrl || `<${envName}>`}</td>
                </tr>
                {filterHeaders.length > 0 ? (
                  <tr>
                    <td>Headers</td>
                    <td>
                      {filterHeaders &&
                        filterHeaders.filter(k => !!k.name).map((h, i) => [
                          <tr key={i}>
                            <td>
                              {h.name} :{' '}
                              {h.type === 'static'
                                ? h.value
                                : '<' + h.value + '>'}
                            </td>
                          </tr>,
                          i !== filterHeaders.length - 1 ? <hr /> : null,
                        ])}
                    </td>
                  </tr>
                ) : null}
                {/*
                <tr>
                  <td>Webhook</td>
                  <td>in-use/bypassed</td>
                </tr>
                */}
              </tbody>
            </table>
          </div>
          <div className={styles.commonBtn + ' ' + styles.detailsRefreshButton}>
            <span>
              <ReloadMetadata
                {...this.props}
                btnText={'Refresh schema'}
                btnTextChanging={'Refreshing schema...'}
                bsClass={styles.yellow_button}
              />
            </span>
            <span>
              <OverlayTrigger placement="right" overlay={refresh}>
                <i className="fa fa-question-circle" aria-hidden="true" />
              </OverlayTrigger>
            </span>
          </div>
        </div>
        <br />
        <br />
      </div>
    );
  }
}

const mapStateToProps = state => {
  return {
    ...state.customResolverData.addData,
    ...state.customResolverData.headerData,
    dataHeaders: { ...state.tables.dataHeaders },
  };
};

export default connect => connect(mapStateToProps)(ViewStitchedSchema);
@ -0,0 +1,10 @@
const tabInfo = {
  details: {
    display_text: 'Details',
  },
  modify: {
    display_text: 'Modify',
  },
};

export default tabInfo;
@ -0,0 +1,84 @@
import React from 'react';
import Helmet from 'react-helmet';
import { push } from 'react-router-redux';

import { appPrefix, pageTitle } from '../constants';
import globals from '../../../../Globals';

class CustomResolver extends React.Component {
  render() {
    const styles = require('../Styles.scss');
    // const landingImage = require('./schema-stitching-color.png');
    // const landingImage = 'https://storage.googleapis.com/hasura-graphql-engine/console/assets/schema-stitching-diagram.png';

    const { dispatch, migrationMode } = this.props;
    return (
      <div
        className={`${styles.padd_left_remove} ${
          styles.resolverWrapper
        } container-fluid ${styles.padd_top}`}
      >
        <div className={styles.padd_left}>
          <Helmet title={`${pageTitle}s | Hasura`} />
          <div>
            <h2 className={`${styles.heading_text} ${styles.inline_block}`}>
              Remote Schemas
            </h2>
            {migrationMode ? (
              <button
                data-test="data-create-remote-schemas"
                className={styles.yellow_button}
                onClick={e => {
                  e.preventDefault();
                  dispatch(push(`${globals.urlPrefix}${appPrefix}/manage/add`));
                }}
              >
                Add
              </button>
            ) : null}
            <hr />
          </div>
          {/*
          <div className={styles.resolverContent}>
            Add pre-CRUD custom business logic like data validation, etc. or also
            fetch data from another GraphQL server by stitching schemas
          </div>
          <div className={styles.resolverImg}>
            <img src={landingImage} />
          </div>
          <div className={styles.commonBtn}>
            <Link
              className={styles.padd_remove_full}
              to={`${appPrefix}/manage/add`}
            >
              <button className={styles.yellow_button}>
                Add Remote GraphQL schema
              </button>
            </Link>
          </div>
          <div className={styles.readMore}>
            <a
              href="https://docs.hasura.io/1.0/graphql/manual/schema/custom-logic/index.html"
              target="_blank"
              rel="noopener noreferrer"
            >
              Read more
            </a>
          </div>
          */}
        </div>
      </div>
    );
  }
}

const mapStateToProps = state => {
  return {
    migrationMode: state.main.migrationMode,
  };
};

const landingCustomResolverGen = connect =>
  connect(mapStateToProps)(CustomResolver);

export default landingCustomResolverGen;
142
console/src/components/Services/CustomResolver/Styles.scss
Normal file
@ -0,0 +1,142 @@
@import "../../Common/Common.scss";

.addPaddCommom {
  padding: 10px 0;
}
.wd_300 {
  width: 300px;
}
.addPaddTop {
  padding-top: 20px;
}
.dropdown_button {
  width: 300px;
}
.detailsSection {
  width: 80%;
  margin-top: 20px;
}
.detailsRefreshButton {
  padding-top: 10px !important;
}
.resolverWrapper {
  // padding: 10px 0;
  .resolverContent {
    text-align: center;
    padding-bottom: 10px;
    font-weight: 600;
  }
  .resolverImg {
    width: 100%;
    padding: 20px;
    text-align: center;
    // height: 200px;
    // border: 1px solid #000;
    img {
    }
  }
  .commonBtn {
    text-align: center;
    padding: 20px 0;
    padding-bottom: 10px;
  }
  .readMore {
    padding-bottom: 10px;
    text-align: center;
  }
}
.CommonWrapper {
  .check_box {
    margin-bottom: 20px;
    label {
      cursor: pointer;
      margin-right: 10px;
    }
    input {
      margin: 0;
      cursor: pointer;
    }
    i {
      cursor: pointer;
    }
  }
  .font_normal {
    font-weight: normal;
  }
  .subheading_text {
    i {
      cursor: pointer;
    }
  }

  .defaultWidth {
    width: 300px;
  }
  .radioLabel {
    padding-right: 20px;
    text-align: left;
    padding-top: 5px;
    .radioInput {
      cursor: pointer;
    }
  }
  .inputLabel {
    margin-left: 0;
    input {
      width: 300px;
    }
  }
}
.addWrapper {
  .commonBtn {
    padding: 20px 0;
    padding-top: 40px;
    .delete_confirmation_error {
      margin-left: 15px;
      color: #d9534f;
    }
    .yellow_button {
      margin-right: 20px;
    }
    a {
      margin-left: 20px;
    }
    .refresh_schema_btn {
      margin-left: 20px;
    }
    span {
      i {
        cursor: pointer;
        margin-left: 10px;
      }
    }
  }
  .remove_padding_bottom {
    padding-bottom: 0px;
  }
  .set_line_height {
    line-height: 26px;
  }

}
@ -0,0 +1,4 @@
const appPrefix = '/remote-schemas';
const pageTitle = 'Remote Schema';

export { appPrefix, pageTitle };
176
console/src/components/Services/CustomResolver/customActions.js
Normal file
@ -0,0 +1,176 @@
/* */
import { listState } from './state';
/* */

import Endpoints, { globalCookiePolicy } from '../../../Endpoints';
import requestAction from '../../../utils/requestAction';
import dataHeaders from '../Data/Common/Headers';
import globals from '../../../Globals';
import returnMigrateUrl from '../Data/Common/getMigrateUrl';
import { SERVER_CONSOLE_MODE } from '../../../constants';
import { loadMigrationStatus } from '../../Main/Actions';
import { handleMigrationErrors } from '../EventTrigger/EventActions';

import { showSuccessNotification } from '../Data/Notification';

/* Action constants */

const FETCH_RESOLVERS = '@customResolver/FETCH_RESOLVERS';
const RESOLVERS_FETCH_SUCCESS = '@customResolver/RESOLVERS_FETCH_SUCCESS';
const FILTER_RESOLVER = '@customResolver/FILTER_RESOLVER';
const RESOLVERS_FETCH_FAIL = '@customResolver/RESOLVERS_FETCH_FAIL';
const RESET = '@customResolver/RESET';

const VIEW_RESOLVER = '@customResolver/VIEW_RESOLVER';

/* */

const fetchResolvers = () => {
  return (dispatch, getState) => {
    const url = Endpoints.getSchema;
    const options = {
      credentials: globalCookiePolicy,
      method: 'POST',
      headers: dataHeaders(getState),
      body: JSON.stringify({
        type: 'select',
        args: {
          table: {
            name: 'remote_schemas',
            schema: 'hdb_catalog',
          },
          columns: ['*'],
          order_by: [{ column: 'name', type: 'asc', nulls: 'last' }],
        },
      }),
    };
    dispatch({ type: FETCH_RESOLVERS });
    return dispatch(requestAction(url, options)).then(
      data => {
        dispatch({ type: RESOLVERS_FETCH_SUCCESS, data: data });
        return Promise.resolve();
      },
      error => {
        console.error('Failed to load remote schemas: ' + JSON.stringify(error));
        dispatch({ type: RESOLVERS_FETCH_FAIL, data: error });
        return Promise.reject();
      }
    );
  };
};

const listReducer = (state = listState, action) => {
  switch (action.type) {
    case FETCH_RESOLVERS:
      return {
        ...state,
        isRequesting: true,
        isError: false,
      };

    case RESOLVERS_FETCH_SUCCESS:
      return {
        ...state,
        resolvers: action.data,
        isRequesting: false,
        isError: false,
      };

    case RESOLVERS_FETCH_FAIL:
      return {
        ...state,
        resolvers: [],
        isRequesting: false,
        isError: action.data,
      };
    case FILTER_RESOLVER:
      return {
        ...state,
        ...action.data,
      };
    case RESET:
      return {
        ...listState,
      };
    case VIEW_RESOLVER:
      return {
        ...state,
        viewResolver: action.data,
      };
    default:
      return {
        ...state,
      };
  }
};

/* The makeRequest function identifies the current console mode and sends either a normal query or a migration call */
const makeRequest = (
  upQueries,
  downQueries,
  migrationName,
  customOnSuccess,
  customOnError,
  requestMsg,
  successMsg,
  errorMsg
) => {
  return (dispatch, getState) => {
    const upQuery = {
      type: 'bulk',
      args: upQueries,
    };

    const downQuery = {
      type: 'bulk',
      args: downQueries,
    };

    const migrationBody = {
      name: migrationName,
      up: upQuery.args,
      down: downQuery.args,
    };

    const currMigrationMode = getState().main.migrationMode;

    const migrateUrl = returnMigrateUrl(currMigrationMode);

    let finalReqBody;
    if (globals.consoleMode === SERVER_CONSOLE_MODE) {
      finalReqBody = upQuery;
    } else if (globals.consoleMode === 'cli') {
      finalReqBody = migrationBody;
    }
    const url = migrateUrl;
    const options = {
      method: 'POST',
      credentials: globalCookiePolicy,
      headers: dataHeaders(getState),
      body: JSON.stringify(finalReqBody),
    };

    const onSuccess = data => {
      if (globals.consoleMode === 'cli') {
        dispatch(loadMigrationStatus()); // don't call for server mode
      }
      // dispatch(loadTriggers());
      if (successMsg) {
        dispatch(showSuccessNotification(successMsg));
      }
      customOnSuccess(data);
    };

    const onError = err => {
      dispatch(handleMigrationErrors(errorMsg, err));
      customOnError(err);
    };

    dispatch(showSuccessNotification(requestMsg));
    return dispatch(requestAction(url, options)).then(onSuccess, onError);
  };
};
/* */

export { fetchResolvers, FILTER_RESOLVER, VIEW_RESOLVER, makeRequest };
export default listReducer;
@ -0,0 +1,19 @@
import { combineReducers } from 'redux';

import listReducer from './customActions';
import addReducer from './Add/addResolverReducer';
import headerReducer from '../Layout/ReusableHeader/HeaderReducer';

const customResolverReducer = combineReducers({
  addData: addReducer,
  listData: listReducer,
  headerData: headerReducer('CUSTOM_RESOLVER', [
    {
      name: '',
      type: 'static',
      value: '',
    },
  ]),
});

export default customResolverReducer;
8
console/src/components/Services/CustomResolver/index.js
Normal file
@ -0,0 +1,8 @@
export landingCustomResolverGen from './Landing/CustomResolver';

export getCustomResolverRouter from './CustomResolverRouter';
export addConnector from './Add/Add';
export customResolverReducer from './customResolverReducer';

export editConnector from './Edit/Edit';
export viewConnector from './Edit/View';
34
console/src/components/Services/CustomResolver/state.js
Normal file
@ -0,0 +1,34 @@
const asyncState = {
  isRequesting: false,
  isError: false,
  isFetching: false,
  isFetchError: null,
};

const listState = {
  resolvers: [],
  filtered: [],
  searchQuery: '',
  viewResolver: '',
  ...asyncState,
};

const addState = {
  manualUrl: '',
  envName: null,
  headers: [],
  name: '',
  forwardClientHeaders: false,
  ...asyncState,
  editState: {
    id: -1,
    isModify: false,
    originalName: '',
    originalHeaders: [],
    originalUrl: '',
    originalEnvUrl: '',
    originalForwardClientHeaders: false,
  },
};

export { listState, addState };
@@ -20,7 +20,7 @@ class ReloadMetadata extends Component {
       <div className={metaDataStyles.display_inline}>
         <button
           data-test="data-reload-metadata"
-          className={styles.default_button + ' ' + metaDataStyles.margin_right}
+          className={this.props.bsClass || styles.default_button}
           onClick={e => {
             e.preventDefault();
             this.setState({ isReloading: true });
@@ -72,7 +72,9 @@ class ReloadMetadata extends Component {
           });
         }}
       >
-        {this.state.isReloading ? 'Reloading...' : 'Reload'}
+        {this.state.isReloading
+          ? this.props.btnTextChanging || 'Reloading...'
+          : this.props.btnText || 'Reload'}
       </button>
     </div>
   );
@@ -22,7 +22,7 @@ class ResetMetadata extends Component {
       <div className={metaDataStyles.display_inline}>
         <button
           data-test="data-reset-metadata"
-          className={styles.default_button + ' ' + metaDataStyles.margin_right}
+          className={styles.default_button}
           onClick={e => {
             e.preventDefault();
             const a = prompt(
@@ -0,0 +1,45 @@
import React from 'react';
import PropTypes from 'prop-types';
import { Link } from 'react-router';

class BreadCrumb extends React.Component {
  render() {
    const { breadCrumbs } = this.props;
    const styles = require('../../EventTrigger/TableCommon/Table.scss');
    const bC =
      breadCrumbs && breadCrumbs.length > 0
        ? breadCrumbs.map((b, i) => {
            const Sp = () => {
              const space = ' ';
              return space;
            };
            const addArrow = () => [
              <Sp key={'breadcrumb-space-before' + i} />,
              <i
                key={'l' + i}
                className="fa fa-angle-right"
                aria-hidden="true"
              />,
              <Sp key={'breadcrumb-space-after' + i} />,
            ];
            if (i !== breadCrumbs.length - 1) {
              return [
                <Link key={'l' + i} to={`${b.url}`}>
                  {b.title}
                </Link>,
                addArrow(),
              ];
            }
            return [b.title];
          })
        : null;

    return <div className={styles.dataBreadCrumb}>You are here: {bC}</div>;
  }
}

BreadCrumb.propTypes = {
  breadCrumbs: PropTypes.array.isRequired,
};

export default BreadCrumb;
@@ -0,0 +1,37 @@
import React from 'react';

import BreadCrumb from '../../Layout/BreadCrumb/BreadCrumb';
import Tabs from '../../Layout/ReusableTabs/ReusableTabs';

class CommonTabLayout extends React.Component {
  render() {
    const styles = require('./CommonTabLayout.scss');
    const {
      breadCrumbs,
      heading,
      appPrefix,
      currentTab,
      tabsInfo,
      baseUrl,
      showLoader,
    } = this.props;

    return (
      <div className={styles.subHeader}>
        <BreadCrumb breadCrumbs={breadCrumbs} />
        <h2 className={styles.heading_text + ' ' + styles.set_line_height}>
          {heading || ''}
        </h2>
        <Tabs
          appPrefix={appPrefix}
          tabName={currentTab}
          tabsInfo={tabsInfo}
          baseUrl={baseUrl}
          showLoader={showLoader}
        />
      </div>
    );
  }
}

export default CommonTabLayout;
@@ -0,0 +1 @@
@import '../../../Common/Common.scss';
@@ -0,0 +1,58 @@
import React from 'react';

import LeftNavBar from '../LeftNavBar/LeftNavBar';
import Helmet from 'react-helmet';

import { Link } from 'react-router';

import PropTypes from 'prop-types';

class LayoutWrapper extends React.Component {
  render() {
    const styles = require('../../Data/TableCommon/Table.scss');
    const { appPrefix, children } = this.props;
    // const currentLocation = location ? location.pathname : '';
    return (
      <div>
        <Helmet title={'Custom Resolvers | Hasura'} />
        <div className={styles.wd20 + ' ' + styles.align_left}>
          <div
            className={styles.pageSidebar + ' col-xs-12 ' + styles.padd_remove}
          >
            <div>
              <ul>
                <li role="presentation">
                  <div className={styles.schemaWrapper}>
                    <div
                      className={styles.schemaSidebarSection}
                      data-test="schema"
                    >
                      <Link
                        className={styles.schemaBorder}
                        to={appPrefix + '/manage'}
                      >
                        Manage
                      </Link>
                    </div>
                  </div>
                  <LeftNavBar {...this.props} />
                </li>
              </ul>
            </div>
          </div>
        </div>
        <div className={styles.wd80}>{children}</div>
      </div>
    );
  }
}

LayoutWrapper.propTypes = {
  appPrefix: PropTypes.string.isRequired,
};

export default (connect, mapStateToProps, mapDispatchToProps) =>
  connect(
    mapStateToProps,
    mapDispatchToProps
  )(LayoutWrapper);
32  console/src/components/Services/Layout/LeftNavBar/Actions.js  Normal file
@@ -0,0 +1,32 @@
/* State

{
  ongoingRequest : false, // true if request is going on
  lastError : null OR <string>
  lastSuccess: null OR <string>
}

*/
import defaultState from './State';

const SET_USERNAME = 'PageContainer/SET_USERNAME';

// HTML Component defines what state it needs
// HTML Component should be able to emit actions
// When an action happens, the state is modified (using the reducer function)
// When the state is modified, anybody dependent on the state is asked to update
// HTML Component is listening to state, hence re-renders

const homeReducer = (state = defaultState, action) => {
  switch (action.type) {
    case SET_USERNAME:
      return { username: action.data };
    default:
      return state;
  }
};

const setUsername = username => ({ type: SET_USERNAME, data: username });

export default homeReducer;
export { setUsername };
@@ -0,0 +1,98 @@
/* eslint-disable no-unused-vars */

import React from 'react';
import { Link } from 'react-router';

import { LISTING_SCHEMA } from '../../Data/DataActions';

const LeftNavBar = ({
  appPrefix,
  listItemTemplate,
  dataList,
  filtered,
  searchQuery,
  location,
  filterItem,
  viewResolver,
  migrationMode,
}) => {
  const styles = require('./LeftNavBar.scss');
  // Now schema might be null or an empty array

  function tableSearch(e) {
    const searchTerm = e.target.value;
    filterItem(dataList, searchTerm);
  }
  // TODO: Make it generic so that other components can use it.

  return (
    <div className={styles.schemaTableList}>
      <div className={styles.display_flex + ' ' + styles.padd_top_medium}>
        <div
          className={
            styles.sidebarSearch + ' form-group col-xs-12 ' + styles.padd_remove
          }
        >
          <i className="fa fa-search" aria-hidden="true" />
          <input
            type="text"
            onChange={tableSearch.bind(this)}
            className="form-control"
            placeholder="Search remote schemas"
            data-test="search-remote-schemas"
          />
        </div>
      </div>
      <div>
        <div className={styles.sidebarHeadingWrapper}>
          <div
            className={
              'col-xs-8 ' +
              styles.sidebarHeading +
              ' ' +
              styles.padd_left_remove
            }
          >
            Remote Schemas ({dataList.length})
          </div>

          {migrationMode ? (
            <div
              className={
                'col-xs-4 text-center ' +
                styles.padd_remove +
                ' ' +
                styles.sidebarCreateTable
              }
            >
              <Link
                className={styles.padd_remove_full}
                to={`${appPrefix}/manage/add`}
              >
                <button
                  className={styles.add_mar_right + ' btn btn-xs btn-default'}
                  data-test="remote-schema-sidebar-add-table"
                >
                  Add
                </button>
              </Link>
            </div>
          ) : null}
        </div>
        <ul
          className={styles.schemaListUl}
          data-test="remote-schema-table-links"
        >
          {listItemTemplate(
            searchQuery ? filtered : dataList,
            styles,
            location,
            viewResolver
          )}
        </ul>
      </div>
    </div>
  );
};

export default LeftNavBar;
@@ -0,0 +1,169 @@
@import "~bootstrap-sass/assets/stylesheets/bootstrap/variables";
@import "../../../Common/Common.scss";
.container {
}
.displayFlexContainer {
  display: flex;
}
.flexRow {
  display: flex;
  margin-bottom: 20px;
}

.add_btn {
  margin: 10px 0;
}

.account {
  padding: 20px 0;
  line-height: 26px;
}

.changeSchema {
  margin-left: 10px;
  width: auto;
}

.sidebar {
  height: calc(100vh - 26px);
  overflow: auto;
  // background: #444;
  // color: $navbar-inverse-color;
  color: #333;
  border: 1px solid #E5E5E5;
  background-color: #F8F8F8;
  /*
  a,a:visited {
    color: $navbar-inverse-link-color;
  }
  a:hover {
    color: $navbar-inverse-link-hover-color;
  }
  */
  hr {
    margin: 0;
    border-color: $navbar-inverse-color;
  }
  ul {
    list-style-type: none;
    padding-top: 10px;
    padding-left: 7px;
    li {
      padding: 7px 0;
      transition: color 0.5s;
      /*
      a,a:visited {
        color: $navbar-inverse-link-color;
      }
      a:hover {
        color: $navbar-inverse-link-hover-color;
      }
      */

      a {
        color: #767E93;
        word-wrap: break-word;
      }
    }
    li:hover {
      padding: 7px 0;
      // color: $navbar-inverse-link-hover-color;
      transition: color 0.5s;
      cursor: pointer;
    }
  }
}

.main {
  padding: 0;
  height: $mainContainerHeight;
  overflow: auto;
}

.sidebarSearch {
  margin-right: 20px;
  padding: 10px 0px;
  padding-bottom: 0px;
  position: relative;
  i {
    position: absolute;
    padding: 10px;
    font-size: 14px;
    padding-left: 8px;
    color: #979797;
  }
  input {
    padding-left: 25px;
  }
}
.sidebarHeadingWrapper {
  width: 100%;
  float: left;
  padding-bottom: 10px;
  .sidebarHeading {
    font-weight: bold;
    display: inline-block;
    color: #767E93;
    font-size: 15px;
  }
}
.schemaTableList {
  // max-height: 300px;
  // overflow-y: auto;
  overflow: auto;
  padding-left: 20px;
  max-height: calc(100vh - 275px);
}
.schemaListUl {
  padding-left: 5px;
  padding-bottom: 10px;
  li {
    border-bottom: 0px !important;
    padding: 0 0 !important;
    a {
      background: transparent !important;
      padding: 5px 0px !important;
      font-weight: 400 !important;
      padding-left: 5px !important;
      .tableIcon {
        margin-right: 5px;
        font-size: 12px;
      }
    }
  }
  .noTables {
    font-weight: 400 !important;
    padding-bottom: 10px !important;
    color: #767E93 !important;
  }
  li:first-child {
    padding-top: 15px !important;
  }
}

.heading_tooltip {
  display: inline-block;
  padding-right: 10px;
}

.addAllBtn {
  margin-left: 15px;
}

.activeTable {
  a {
    // border-left: 4px solid #FFC627;
    color: #FD9540 !important;
  }
}

.floatRight {
  float: right;
  margin-right: 20px;
}
@@ -0,0 +1,5 @@
const defaultState = {
  username: 'Guest User',
};

export default defaultState;
204  console/src/components/Services/Layout/ReusableHeader/Header.js  Normal file
@@ -0,0 +1,204 @@
import React from 'react';

import { generateHeaderSyms } from './HeaderReducer';
import DropdownButton from '../../CustomResolver/Common/DropdownButton';

class Header extends React.Component {
  constructor(props) {
    super(props);
    this.state = {
      ...generateHeaderSyms(props.eventPrefix),
    };
  }
  componentWillUnmount() {
    // Reset the header whenever it is unmounted
    this.props.dispatch({
      type: this.state.RESET_HEADER,
    });
  }
  getIndex(e) {
    const indexId = e.target.getAttribute('data-index-id');
    return parseInt(indexId, 10);
  }
  getTitle(val, k) {
    return val.filter(v => v.value === k);
  }
  headerKeyChange(e) {
    const indexId = this.getIndex(e);
    if (indexId < 0) {
      console.error('Unable to handle event');
      return;
    }
    Promise.all([
      this.props.dispatch({
        type: this.state.HEADER_KEY_CHANGE,
        data: {
          name: e.target.value,
          index: indexId,
        },
      }),
    ]);
  }
  checkAndAddNew(e) {
    const indexId = this.getIndex(e);
    if (indexId < 0) {
      console.error('Unable to handle event');
      return;
    }
    if (
      this.props.headers[indexId].name &&
      this.props.headers[indexId].name.length > 0 &&
      indexId === this.props.headers.length - 1
    ) {
      Promise.all([this.props.dispatch({ type: this.state.ADD_NEW_HEADER })]);
    }
  }
  headerValueChange(e) {
    const indexId = this.getIndex(e);
    if (indexId < 0) {
      console.error('Unable to handle event');
      return;
    }
    this.props.dispatch({
      type: this.state.HEADER_VALUE_CHANGE,
      data: {
        value: e.target.value,
        index: indexId,
      },
    });
  }
  headerTypeChange(e) {
    const indexId = this.getIndex(e);
    const typeValue = e.target.getAttribute('value');
    if (indexId < 0) {
      console.error('Unable to handle event');
      return;
    }
    this.props.dispatch({
      type: this.state.HEADER_VALUE_TYPE_CHANGE,
      data: {
        type: typeValue,
        index: indexId,
      },
    });
  }
  deleteHeader(e) {
    const indexId = this.getIndex(e);
    if (indexId < 0) {
      console.error('Unable to handle event');
      return;
    }
    this.props.dispatch({
      type: this.state.DELETE_HEADER,
      data: {
        type: e.target.value,
        index: indexId,
      },
    });
  }

  render() {
    const styles = require('./Header.scss');
    const { isDisabled } = this.props;
    const generateHeaderHtml = this.props.headers.map((h, i) => {
      const getTitle = this.getTitle(this.props.typeOptions, h.type);
      return (
        <div
          className={
            styles.common_header_wrapper +
            ' ' +
            styles.display_flex +
            ' form-group'
          }
          key={i}
        >
          <input
            type="text"
            className={
              styles.input +
              ' form-control ' +
              styles.add_mar_right +
              ' ' +
              styles.defaultWidth
            }
            data-index-id={i}
            value={h.name}
            onChange={this.headerKeyChange.bind(this)}
            onBlur={this.checkAndAddNew.bind(this)}
            placeholder={this.props.keyInputPlaceholder}
            disabled={isDisabled}
            data-test={`remote-schema-header-test${i + 1}-key`}
          />
          <span className={styles.header_colon}>:</span>
          <span className={styles.value_wd}>
            <DropdownButton
              dropdownOptions={this.props.typeOptions}
              title={getTitle.length > 0 ? getTitle[0].display_text : 'Value'}
              dataKey={h.type}
              dataIndex={i}
              onButtonChange={this.headerTypeChange.bind(this)}
              onInputChange={this.headerValueChange.bind(this)}
              inputVal={h.value}
              disabled={isDisabled}
              id={'common-header-' + (i + 1)}
              inputPlaceHolder={this.props.placeHolderText(h.type)}
              testId={`remote-schema-header-test${i + 1}`}
            />
          </span>
          {/*
          <select
            className={
              'form-control ' +
              styles.add_pad_left +
              ' ' +
              styles.add_mar_right +
              ' ' +
              styles.defaultWidth
            }
            value={h.type}
            onChange={this.headerTypeChange.bind(this)}
            data-index-id={i}
            disabled={isDisabled}
          >
            <option disabled value="">
              -- value type --
            </option>
            {this.props.typeOptions.map((o, k) => (
              <option key={k} value={o.value} data-index-id={i}>
                {o.display}
              </option>
            ))}
          </select>
          <input
            type="text"
            className={
              styles.inputDefault +
              ' form-control ' +
              styles.defaultWidth +
              ' ' +
              styles.add_pad_left
            }
            placeholder="value"
            value={h.value}
            onChange={this.headerValueChange.bind(this)}
            data-index-id={i}
            disabled={isDisabled}
          />
          */}
          {i !== this.props.headers.length - 1 && !isDisabled ? (
            <i
              className={styles.fontAwosomeClose + ' fa-lg fa fa-times'}
              onClick={this.deleteHeader.bind(this)}
              data-index-id={i}
            />
          ) : null}
        </div>
      );
    });
    return <div className={this.props.wrapper_class}>{generateHeaderHtml}</div>;
  }
}

// Add proptypes

export default Header;
@@ -0,0 +1,19 @@
@import "../../../Common/Common.scss";

.common_header_wrapper {
  .defaultWidth {
    width: 300px;
  }
  .add_mar_right {
    margin-right: 10px !important;
  }
  .header_colon {
    margin-right: 10px;
    font-weight: bold;
    font-size: 24px;
  }
  .value_wd {
    width: 300px;
  }
}
@@ -0,0 +1,117 @@
/* Default state */
const defaultState = {
  headers: [
    {
      name: '',
      type: '',
      value: '',
    },
  ],
};

/* */

/* Action constants */
const generateHeaderSyms = (prefix = 'API_HEADER') => {
  return {
    HEADER_KEY_CHANGE: `${prefix}/HEADER_KEY_CHANGE`,
    HEADER_VALUE_TYPE_CHANGE: `${prefix}/HEADER_VALUE_TYPE_CHANGE`,
    HEADER_VALUE_CHANGE: `${prefix}/HEADER_VALUE_CHANGE`,
    UPDATE_HEADERS: `${prefix}/UPDATE_HEADERS`,
    RESET_HEADER: `${prefix}/RESET_HEADER`,
    ADD_NEW_HEADER: `${prefix}/ADD_NEW_HEADER`,
    DELETE_HEADER: `${prefix}/DELETE_HEADER`,
  };
};
/* */

const generateReducer = (eventPrefix, defaultHeaders) => {
  /* Action constants */
  if (defaultHeaders && defaultHeaders.length > 0) {
    defaultState.headers = [...defaultHeaders];
  }
  const {
    HEADER_KEY_CHANGE,
    HEADER_VALUE_CHANGE,
    HEADER_VALUE_TYPE_CHANGE,
    RESET_HEADER,
    DELETE_HEADER,
    ADD_NEW_HEADER,
    UPDATE_HEADERS,
  } = generateHeaderSyms(eventPrefix);
  /* */

  /* Reducer */
  const headerReducer = (state = defaultState, action) => {
    switch (action.type) {
      case HEADER_KEY_CHANGE:
        return {
          ...state,
          headers: [
            ...state.headers.slice(0, action.data.index),
            {
              ...state.headers[action.data.index],
              name: action.data.name,
            },
            ...state.headers.slice(action.data.index + 1, state.headers.length),
          ],
        };
      case HEADER_VALUE_TYPE_CHANGE:
        return {
          ...state,
          headers: [
            ...state.headers.slice(0, action.data.index),
            {
              ...state.headers[action.data.index],
              type: action.data.type,
            },
            ...state.headers.slice(action.data.index + 1, state.headers.length),
          ],
        };
      case HEADER_VALUE_CHANGE:
        return {
          ...state,
          headers: [
            ...state.headers.slice(0, action.data.index),
            {
              ...state.headers[action.data.index],
              value: action.data.value,
            },
            ...state.headers.slice(action.data.index + 1, state.headers.length),
          ],
        };
      case ADD_NEW_HEADER:
        return {
          ...state,
          headers: [...state.headers, { ...defaultState.headers[0] }],
        };

      case DELETE_HEADER:
        return {
          ...state,
          headers: [
            ...state.headers.slice(0, action.data.index),
            ...state.headers.slice(action.data.index + 1, state.headers.length),
          ],
        };
      case RESET_HEADER:
        return {
          ...defaultState,
        };
      case UPDATE_HEADERS:
        return {
          ...state,
          headers: [...action.data],
        };
      default:
        return {
          ...state,
        };
    }
  };
  return headerReducer;
  /* */
};

export { generateHeaderSyms };
export default generateReducer;
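The reducer above replaces one header immutably with the slice/spread pattern and namespaces its action types behind a prefix so several screens can reuse the same reducer factory. A standalone sketch of those two ideas, with `updateAt` and `generateSyms` as illustrative stand-ins rather than the module's actual exports:

```javascript
// Replace element `index` of an array without mutating it, merging in a patch;
// this mirrors the slice/spread update used by HEADER_KEY_CHANGE and friends.
const updateAt = (arr, index, patch) => [
  ...arr.slice(0, index),
  { ...arr[index], ...patch },
  ...arr.slice(index + 1, arr.length),
];

// Prefix action types, as generateHeaderSyms does for e.g. 'CUSTOM_RESOLVER',
// so two mounted header editors never see each other's actions.
const generateSyms = prefix => ({
  HEADER_KEY_CHANGE: `${prefix}/HEADER_KEY_CHANGE`,
});

const headers = [{ name: 'x-api-key', type: 'static', value: '' }];
const next = updateAt(headers, 0, { value: 'secret' });
// `headers` is untouched; `next` holds the patched copy.
```

Because each state transition returns a fresh array and fresh patched object, connected components can rely on reference equality to decide when to re-render.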
@@ -0,0 +1,41 @@
import React from 'react';
import { Link } from 'react-router';

const Tabs = ({ appPrefix, tabsInfo, tabName, count, baseUrl, showLoader }) => {
  let showCount = '';
  if (!(count === null || count === undefined)) {
    showCount = '(' + count + ')';
  }
  const styles = require('./ReusableTabs.scss');
  const dataLoader = () => {
    return (
      <span className={styles.loader_ml}>
        <i className="fa fa-spinner fa-spin" />
      </span>
    );
  };
  return [
    <div className={styles.common_nav} key={'reusable-tabs-1'}>
      <ul className="nav nav-pills">
        {Object.keys(tabsInfo).map((t, i) => (
          <li
            role="presentation"
            className={tabName === t ? styles.active : ''}
            key={i}
          >
            <Link
              to={`${baseUrl}/${t}`}
              data-test={`${appPrefix.slice(1)}-${t}`}
            >
              {tabsInfo[t].display_text} {tabName === t ? showCount : null}
              {tabName === t && showLoader ? dataLoader() : null}
            </Link>
          </li>
        ))}
      </ul>
    </div>,
    <div className="clearfix" key={'reusable-tabs-2'} />,
  ];
};

export default Tabs;
@@ -0,0 +1,5 @@
@import "../../../Common/Common.scss";

.loader_ml {
  margin-left: 3px;
}
@@ -0,0 +1,80 @@
/* eslint-disable space-infix-ops */
/* eslint-disable no-loop-func */

import PropTypes from 'prop-types';

import React, { Component } from 'react';
import Helmet from 'react-helmet';
import { push } from 'react-router-redux';
import { loadTriggers } from '../EventActions';
import globals from '../../../../Globals';

const appPrefix = globals.urlPrefix + '/events';

class Schema extends Component {
  constructor(props) {
    super(props);
    // Initialize this table
    const dispatch = this.props.dispatch;
    dispatch(loadTriggers());
  }

  render() {
    const { migrationMode, dispatch } = this.props;

    const styles = require('../PageContainer/PageContainer.scss');

    return (
      <div
        className={`${styles.padd_left_remove} container-fluid ${
          styles.padd_top
        }`}
      >
        <div className={styles.padd_left}>
          <Helmet title="Event Triggers | Hasura" />
          <div>
            <h2 className={`${styles.heading_text} ${styles.inline_block}`}>
              {' '}
              Event Triggers{' '}
            </h2>
            {migrationMode ? (
              <button
                data-test="data-create-trigger"
                className={styles.yellow_button}
                onClick={e => {
                  e.preventDefault();
                  dispatch(push(`${appPrefix}/manage/triggers/add`));
                }}
              >
                Create Trigger
              </button>
            ) : null}
          </div>
          <hr />
        </div>
      </div>
    );
  }
}

Schema.propTypes = {
  schema: PropTypes.array.isRequired,
  untracked: PropTypes.array.isRequired,
  untrackedRelations: PropTypes.array.isRequired,
  migrationMode: PropTypes.bool.isRequired,
  currentSchema: PropTypes.string.isRequired,
  dispatch: PropTypes.func.isRequired,
};

const mapStateToProps = state => ({
  schema: state.tables.allSchemas,
  schemaList: state.tables.schemaList,
  untracked: state.tables.untrackedSchemas,
  migrationMode: state.main.migrationMode,
  untrackedRelations: state.tables.untrackedRelations,
  currentSchema: state.tables.currentSchema,
});

const schemaConnector = connect => connect(mapStateToProps)(Schema);

export default schemaConnector;
@@ -0,0 +1,31 @@
import React from 'react';

const SchemaContainer = ({ children }) => {
  const styles = require('./SchemaContainer.scss');
  return (
    <div className={styles.container + ' container-fluid'}>
      <div className="row">
        <div
          className={
            styles.main + ' ' + styles.padd_left_remove + ' ' + styles.padd_top
          }
        >
          <div className={styles.rightBar + ' '}>
            {children && React.cloneElement(children)}
          </div>
        </div>
      </div>
    </div>
  );
};

const mapStateToProps = state => {
  return {
    schema: state.tables.allSchemas,
  };
};

const schemaContainerConnector = connect =>
  connect(mapStateToProps)(SchemaContainer);

export default schemaContainerConnector;
@@ -0,0 +1,63 @@
@import "~bootstrap-sass/assets/stylesheets/bootstrap/variables";
@import "../../../Common/Common.scss";

.flexRow {
  display: flex;
}
.padd_left_remove {
  padding-left: 0;
}
.add_btn {
  margin: 10px 0;
}

.account {
  padding: 20px 0;
  line-height: 26px;
}

.sidebar {
  height: $mainContainerHeight;
  overflow: auto;
  background: #444;
  color: $navbar-inverse-color;
  hr {
    margin: 0;
    border-color: $navbar-inverse-color;
  }
  ul {
    list-style-type: none;
    padding-top: 10px;
    padding-left: 7px;
    li {
      padding: 7px 0;
      transition: color 0.5s;
      a,a:visited {
        color: $navbar-inverse-link-color;
      }
      a:hover {
        color: $navbar-inverse-link-hover-color;
        text-decoration: none;
      }
    }
    li:hover {
      padding: 7px 0;
      color: $navbar-inverse-link-hover-color;
      transition: color 0.5s;
      cursor: pointer;
    }
  }
}

.main {
  padding: 0;
  padding-left: 15px;
  height: $mainContainerHeight;
  overflow: auto;
  padding-right: 15px;
}

.rightBar {
  padding-left: 15px;
}
3  console/src/components/Services/Layout/index.js  Normal file
@@ -0,0 +1,3 @@
export layoutConnector from './LayoutWrapper/LayoutWrapper';
export LeftNavBar from './LeftNavBar/LeftNavBar';
export rightBar from './RightLayoutWrapper/SchemaContainer';
@@ -9,6 +9,7 @@ const componentsSemver = {
   supportColumnChangeTrigger: '1.0.0-alpha26',
   analyzeApiChange: '1.0.0-alpha26',
   insertPrefix: '1.0.0-alpha26',
+  schemaStitching: '1.0.0-alpha30',
   webhookEnvSupport: '1.0.0-alpha29',
   insertPermRestrictColumns: '1.0.0-alpha28',
 };
@@ -2,6 +2,7 @@ import { combineReducers } from 'redux';
 import { routerReducer } from 'react-router-redux';
 import { dataReducer } from './components/Services/Data';
 import { eventReducer } from './components/Services/EventTrigger';
+import { customResolverReducer } from './components/Services/CustomResolver';
 import mainReducer from './components/Main/Actions';
 import apiExplorerReducer from 'components/ApiExplorer/Actions';
 import progressBarReducer from 'components/App/Actions';
@@ -15,6 +16,7 @@ const reducer = combineReducers({
   apiexplorer: apiExplorerReducer,
   main: mainReducer,
   routing: routerReducer,
+  customResolverData: customResolverReducer,
   notifications,
 });
@@ -23,6 +23,8 @@ import globals from './Globals';

 import validateLogin from './components/Common/validateLogin';

+import { getCustomResolverRouter } from './components/Services/CustomResolver';
+
 const routes = store => {
   // load hasuractl migration status
   const requireMigrationStatus = (nextState, replaceState, cb) => {
@@ -60,6 +62,11 @@ const routes = store => {
   const makeDataRouter = dataRouterUtils.makeDataRouter;
   const makeEventRouter = eventRouterUtils.makeEventRouter;

+  const customResolverRouter = getCustomResolverRouter(
+    connect,
+    store,
+    composeOnEnterHooks
+  );
   return (
     <Route path="/" component={App} onEnter={validateLogin(store)}>
       <Route path="login" component={generatedLoginConnector(connect)} />
@@ -77,6 +84,7 @@ const routes = store => {
           <Route path="metadata" component={metadataConnector(connect)} />
           {makeDataRouter}
           {makeEventRouter}
+          {customResolverRouter}
         </Route>
       </Route>
       <Route path="404" component={PageNotFound} status="404" />
10  console/src/theme/bootstrap.overrides.scss  vendored
@@ -12,6 +12,16 @@ input[type="radio"], input[type="checkbox"]
 {
   margin: 0 5px 0px 0px;
 }
+.dropdown.input-group-btn {
+  button {
+    min-height: 34px;
+  }
+}
+
+.form-group .input-group {
+  min-width: 300px;
+}
+
 label
 {
   margin-bottom: 0 !important;
@@ -84,7 +84,7 @@ algolia_index:
 	$(call check_defined, ALGOLIA_INDEX_NAME)

 	export ALGOLIA_APPLICATION_ID=${ALGOLIA_APPLICATION_ID} ALGOLIA_ADMIN_KEY=${ALGOLIA_ADMIN_KEY} ALGOLIA_INDEX_NAME=${ALGOLIA_INDEX_NAME}
-	python ./algolia_index/algolia_index.py _build/algolia_index/index.json
+	python3 ./algolia_index/algolia_index.py _build/algolia_index/index.json


 .PHONY: dirhtml
101  docs/graphql/manual/business-logic/index.rst.wip  Normal file
@@ -0,0 +1,101 @@
Custom business logic
=====================

For most app backends, you will have to implement custom business logic to complement the CRUD and
real-time API provided by GraphQL Engine. Depending on the nature of the use case and where it sits relative to
GraphQL Engine/Postgres, different avenues are recommended for introducing such business logic into your app's backend:

- **Pre-CRUD**: :ref:`remote-schemas`
- **Post-CRUD**: :ref:`event-triggers`
- :ref:`derived-data`

.. image:: ../../../img/graphql/manual/business-logic/custom-business-logic.png

.. _remote-schemas:

Custom resolvers in remote schemas
----------------------------------

Merging remote schemas is ideal for adding "pre-CRUD" business logic (*logic to be run before you invoke
GraphQL Engine's GraphQL API to insert/modify data in Postgres*) or custom business logic that is not part of
your GraphQL Engine schema. Here are some use cases where remote schemas are ideal:

- Customizing mutations (e.g. running validations before inserts)
- Supporting features like payments, etc. and providing a consistent interface to access them, i.e. behind
  GraphQL Engine's API
- Fetching disparate data from other sources (e.g. from a weather API or another database)

To support these kinds of business logic, a custom GraphQL schema with resolvers that implement said business
logic is needed (*see link below for boilerplates*). This remote schema can then be merged with GraphQL Engine's
schema using the console. Here's a reference architecture diagram for such a setup:

.. image:: ../../../img/graphql/manual/schema/schema-stitching-v1-arch-diagram.png

For more details and links to boilerplates for custom GraphQL servers, please head to :doc:`../remote-schemas/index`.
.. _event-triggers:
|
||||
|
||||
Asynchronous business logic / Event triggers
|
||||
---------------------------------------------
|
||||
|
||||
"Post-CRUD" business logic (*follow-up logic to be run after GraphQL Engine's GraphQL API has been used to insert
|
||||
or modify data in Postgres*) typically tends to be asynchronous, stateless and is triggered on changes to data
|
||||
relevant to each use case. E.g. for every new user in your database, you may want to send out a notification. This
|
||||
business logic is triggered for every new row in your ``users`` table.
|
||||
|
||||
GraphQL Engine comes with built-in event triggers on tables in the Postgres database. These triggers capture events
|
||||
on specified tables and then invoke configured webhooks, which contain your business logic.
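As an illustrative sketch of what such a webhook might do, the snippet below inspects an event payload and decides whether to send a notification. The field names (``event.op``, ``event.data.new``, ``table.name``) follow the shape of Hasura's standard event payload, but treat the exact structure as an assumption and check the event triggers reference:

```python
import json

def handle_event(payload_json):
    """Sketch of a webhook reacting to a database event trigger payload.

    The payload shape (op/data/table) is assumed from Hasura's standard
    event format; verify against the event triggers documentation.
    """
    payload = json.loads(payload_json)
    op = payload["event"]["op"]        # INSERT / UPDATE / DELETE
    table = payload["table"]["name"]
    if op == "INSERT" and table == "users":
        new_row = payload["event"]["data"]["new"]
        return "notify: welcome " + new_row["name"]
    return "ignored"

# A hypothetical payload for a new row in the users table
sample = json.dumps({
    "event": {"op": "INSERT", "data": {"old": None, "new": {"id": 1, "name": "alice"}}},
    "table": {"schema": "public", "name": "users"},
})
print(handle_event(sample))  # notify: welcome alice
```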
|
||||
|
||||
If your business logic is stateful, it can even store its state back in the Postgres instance configured to work
|
||||
with GraphQL Engine, allowing your frontend app to offer a reactive user experience, where the app uses GraphQL
|
||||
subscriptions to listen to updates from your webhook via Postgres.
|
||||
|
||||
.. image:: ../../../img/graphql/manual/event-triggers/database-event-triggers.png
|
||||
|
||||
Event triggers are ideal for use cases such as the following:
|
||||
|
||||
- Notifications: Trigger push notifications and emails based on database events
|
||||
|
||||
- ETL: Transform and load data into external data-stores.
|
||||
|
||||
- E.g. transform data from Postgres and populate an Algolia index when a product is inserted, updated or deleted.
|
||||
|
||||
- Long-running business logic:
|
||||
|
||||
- Provision some infrastructure
|
||||
- Process multimedia files
|
||||
- Background jobs
|
||||
|
||||
- Cache/CDN purge: invalidate/update entries in your cache/CDN when the underlying data in Postgres changes.
|
||||
|
||||
For more information on event triggers and how to set them up, please see :doc:`../event-triggers/index`.
|
||||
|
||||
.. _derived-data:
|
||||
|
||||
Derived data / Data transformations
|
||||
-----------------------------------
|
||||
|
||||
For some use cases, you may want to transform your data in Postgres or run some predetermined function on it to
|
||||
derive another dataset (*that will be queried using GraphQL Engine*). E.g. let's say you store each user's location
|
||||
data in the database as a ``point`` type. You are interested in calculating the distance (*say the haversine distance*)
|
||||
between each pair of users, i.e. you want this derived dataset:
|
||||
|
||||
.. list-table::
|
||||
:header-rows: 1
|
||||
|
||||
* - user_id_1
|
||||
- user_id_2
|
||||
- distance between users
|
||||
* - 12
|
||||
- 23
|
||||
- 10.50
|
||||
* - 12
|
||||
- 47
|
||||
- 76.00
|
||||
|
||||
The easiest way to handle these kinds of use cases is to create a view, which encapsulates your business logic
|
||||
(*in our example, calculating the distance between any two users*), and query your derived/transformed data as you
|
||||
would a table using GraphQL Engine (*with permissions defined explicitly for your view if needed*).
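Before encoding such logic as a view, it can help to prototype the derived dataset in plain code. The sketch below assumes the standard haversine formula over latitude/longitude pairs; the user IDs and coordinates are made up for illustration:

```python
from itertools import combinations
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius ~6371 km

# Hypothetical users table: user_id -> (lat, lon)
users = {12: (12.97, 77.59), 23: (13.08, 80.27), 47: (28.61, 77.21)}

# The derived dataset: one row per unordered pair of users
derived = [
    (u1, u2, round(haversine_km(*users[u1], *users[u2]), 2))
    for u1, u2 in combinations(sorted(users), 2)
]
for row in derived:
    print(row)
```

A view computing the same expression in SQL would then let GraphQL Engine serve this dataset like any table.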
|
||||
|
||||
For more information on how to do this, please see :doc:`../queries/derived-data`.
|
@ -5,6 +5,7 @@ Hasura can be used to create event triggers on tables in the Postgres database.
|
||||
events happening on the specified tables and then call configured webhooks to carry out some business logic.
|
||||
|
||||
.. image:: ../../../img/graphql/manual/event-triggers/database-event-triggers.png
|
||||
:class: no-shadow
|
||||
|
||||
See:
|
||||
^^^^
|
||||
|
@ -17,6 +17,7 @@ The Hasura GraphQL engine lets you setup a GraphQL server and event triggers ove
|
||||
queries/index
|
||||
mutations/index
|
||||
subscriptions/index
|
||||
remote-schemas/index
|
||||
event-triggers/index
|
||||
auth/index
|
||||
migrations/index
|
||||
|
116
docs/graphql/manual/remote-schemas/index.rst
Normal file
@ -0,0 +1,116 @@
|
||||
Remote schemas
|
||||
==============
|
||||
|
||||
Hasura gives you CRUD + realtime GraphQL APIs with authorization & access control. However, in many cases, you will need to write APIs (queries, mutations) that contain custom logic. For example, implementing a payment API, or querying data that is not in your database.
|
||||
|
||||
Hasura has the ability to merge remote GraphQL schemas and provide a unified GraphQL API. Think of it
|
||||
like automated schema stitching. All you need to do is build your own GraphQL service and then provide the HTTP endpoint to Hasura. Your GraphQL service can be written in any language or framework.
|
||||
|
||||
This is what Hasura running with "Remote schemas" looks like:
|
||||
|
||||
|
||||
.. image:: ../../../img/graphql/manual/remote-schemas/remote-schemas-arch.png
|
||||
:class: no-shadow
|
||||
:width: 500px
|
||||
|
||||
.. note::
|
||||
|
||||
Note that this is a new feature in active development. Please give us feedback, report bugs, and ask
|
||||
us questions on our `discord <https://discord.gg/vBPpJkS>`__ or on `github <https://github.com/hasura/graphql-engine>`__.
|
||||
|
||||
Use-cases
|
||||
---------
|
||||
|
||||
- Custom business logic, like a payment API
|
||||
- Querying data that is not available in your database
|
||||
|
||||
|
||||
You can handle these use-cases by writing resolvers in a custom GraphQL server and making Hasura merge this ``remote schema`` with the existing autogenerated schema. You can also add multiple remote schemas. Think of the merged schema as a union of top-level nodes from each of the sub-schemas.
|
||||
|
||||
Note that if you are looking to add authorization & access control for your app users
|
||||
to the GraphQL APIs that are auto-generated via Hasura, head to :doc:`Authorization / Access control <../auth/index>`.
|
||||
|
||||
How-to
|
||||
------
|
||||
|
||||
Follow the steps below to add your remote schema to Hasura.
|
||||
|
||||
Step-1: Write a custom GraphQL server
|
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||
|
||||
You need to create a custom GraphQL server with a schema and corresponding resolvers that solve your use case
|
||||
(*if you already have a functional GraphQL server that meets your requirements, you can skip this step*). You can
|
||||
use any language/framework of your choice to author this server and deploy it anywhere. A great way to get started
|
||||
is to use one of our boilerplates:
|
||||
|
||||
- `Boilerplates <https://github.com/hasura/graphql-engine/tree/master/community/boilerplates/graphql-servers>`__
|
||||
- `Serverless boilerplates <https://github.com/hasura/graphql-serverless>`__
|
||||
|
||||
.. note::
|
||||
|
||||
**Current limitations**:
|
||||
|
||||
- Nomenclature: Type names and node names need to be unique across all merged schemas (*case-sensitive match*). In the next few iterations, support for merging types with the exact same name and structure will be available.
|
||||
- Nodes from different GraphQL servers cannot be used in the same query/mutation. All top-level nodes have to be from the same GraphQL server.
|
||||
- Subscriptions on remote GraphQL servers are not supported.
|
||||
- Interfaces_ and Unions_ are not supported - if a remote schema has interfaces/unions, an error will be thrown if you try to merge it.
|
||||
|
||||
These limitations will be addressed in upcoming versions.
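The nomenclature limitation above can be checked up front before attempting a merge. The sketch below does a case-sensitive comparison across type-name lists; the type names shown are illustrative, not taken from a real schema:

```python
def find_collisions(*schemas):
    """Return type names appearing in more than one schema (case-sensitive),
    which the current merge implementation would reject as duplicates."""
    seen, clashes = {}, set()
    for i, types in enumerate(schemas):
        for t in types:
            if t in seen and seen[t] != i:
                clashes.add(t)
            seen.setdefault(t, i)
    return sorted(clashes)

# Hypothetical type lists for the auto-generated and remote schemas
hasura_types = ["users", "users_insert_input", "Address"]
remote_types = ["Address", "CityWeather"]
print(find_collisions(hasura_types, remote_types))  # ['Address']
```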
|
||||
|
||||
Step-2: Merge remote schema
|
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||
|
||||
Head to the console to merge your remote schema with GraphQL Engine's auto-generated schema. Open the top-level
|
||||
``Remote Schemas`` tab and click on the ``Add`` button.
|
||||
|
||||
.. image:: ../../../img/graphql/manual/business-logic/add-remote-schemas-interface.png
|
||||
|
||||
|
||||
You need to enter the following information:
|
||||
|
||||
- **Remote Schema name**: an alias for the remote schema that must be unique on an instance of GraphQL Engine.
|
||||
- **GraphQL server URL**: the endpoint at which your remote GraphQL server is available. This value can be entered
|
||||
manually or by specifying an environment variable that contains this information. If you want to specify an
|
||||
environment variable, please note that currently there is no validation that the environment variable is
|
||||
actually available at the time of this configuration, so any errors in this configuration will result in a
|
||||
runtime error.
|
||||
- **Headers**: configure the headers to be sent to your custom GraphQL server.
|
||||
|
||||
- Toggle forwarding all headers sent by the client (when making a GraphQL query) to your remote GraphQL server.
|
||||
- Send additional headers to your remote server. These can be static header name-value pairs, and/or "header name-environment variable name" pairs
|
||||
  where the value of the header is picked up from the environment variable.
|
||||
|
||||
**Example**: Let's say your remote GraphQL server needs an ``X-Api-Key`` header. As this value is sensitive (an API key in this
|
||||
example), you can configure the name of an environment variable that will hold the value. This environment variable needs to be present when you start
|
||||
GraphQL Engine. When Hasura sends requests to your remote server, it will pick up the value from this environment variable.
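The header configuration described above can be sketched as a small resolution step run at startup. The env-variable name ``REMOTE_API_KEY`` and the config shape below are hypothetical, purely to illustrate the static-value vs. from-environment distinction:

```python
import os

def resolve_headers(config):
    """Turn header config entries into concrete name/value pairs.

    Each entry is either a static value or names an environment variable
    to read (mirroring the two console options described above).
    """
    headers = {}
    for name, entry in config.items():
        if entry.get("from_env"):
            headers[name] = os.environ[entry["from_env"]]  # must be set at startup
        else:
            headers[name] = entry["value"]
    return headers

os.environ["REMOTE_API_KEY"] = "s3cret"  # hypothetical env var, set for the demo
config = {
    "X-Api-Key": {"from_env": "REMOTE_API_KEY"},
    "X-Client": {"value": "hasura"},
}
print(resolve_headers(config))  # {'X-Api-Key': 's3cret', 'X-Client': 'hasura'}
```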
|
||||
|
||||
.. note::
|
||||
|
||||
If the remote schema configuration references environment variables - either
|
||||
for the URL or for headers - **those variables must be present with valid
|
||||
values** (i.e. GraphQL engine must have been started with them) when
|
||||
adding the remote schema.
|
||||
|
||||
Click on the ``Add Remote Schema`` button to merge the remote schema.
|
||||
|
||||
|
||||
Step-3: Make queries to the remote server from Hasura
|
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||
Now you can head to *GraphiQL* and make queries to your remote server from Hasura.
|
||||
|
||||
Query your remote server by making requests to the Hasura graphql endpoint (``/v1alpha1/graphql``).
|
||||
|
||||
|
||||
.. note::
|
||||
|
||||
For some use cases, you may need to extend the GraphQL schema fields exposed by Hasura GraphQL engine
|
||||
(*and not merely augment as we have done above*) with a custom schema/server. To support them, you can use
|
||||
community tooling to write your own client-facing GraphQL gateway that interacts with GraphQL Engine.
|
||||
|
||||
But adding an additional layer on top of Hasura GraphQL engine significantly impacts the performance provided by it
|
||||
out of the box (*by as much as 4x*). If you need any help with remodeling these kinds of use cases to use the
|
||||
built-in remote schemas feature, please get in touch with us on `Discord <https://discord.gg/vBPpJkS>`__.
|
||||
|
||||
|
||||
.. _Interfaces: https://graphql.github.io/learn/schema/#interfaces
|
||||
.. _Unions: https://graphql.github.io/learn/schema/#union-types
|
@ -1,30 +0,0 @@
|
||||
|
||||
Adding custom GraphQL resolvers
|
||||
===============================
|
||||
|
||||
Hasura GraphQL engine provides instant GraphQL APIs over the tables and views of any Postgres database by
|
||||
auto-generating the CRUD resolvers. However, sometimes you might have to write custom resolvers to capture business
|
||||
logic that is unrelated to the database.
|
||||
|
||||
We have set up `this boilerplate project <https://github.com/hasura/graphql-engine/tree/master/community/boilerplates/custom-resolvers>`_
|
||||
illustrating how to write your own custom GraphQL resolvers and merge them with the Hasura GraphQL engine's resolvers.
|
||||
|
||||
Follow the boilerplate's ``README.md`` for detailed instructions.
|
||||
|
||||
TL;DR
|
||||
-----
|
||||
The boilerplate includes sample custom resolvers for:
|
||||
|
||||
- A ``hello`` query which returns a fixed string.
|
||||
- A ``count`` query that returns a counter from some other data source.
|
||||
- A ``increment_counter`` mutation that increments the value of the above counter.
|
||||
- A ``user_average_age`` query that directly makes an SQL query to Postgres using Knex and returns the result.
|
||||
|
||||
The steps to achieve this are:
|
||||
|
||||
- Create the query/mutation types for your custom GraphQL API.
|
||||
- Write the custom resolver code for the above types.
|
||||
- Make a new GraphQL schema out of these custom resolvers.
|
||||
- Merge this schema with the existing Hasura GraphQL schema and serve the resulting GraphQL API.
|
||||
|
||||
The above steps are implemented using `Apollo's graphql-tools library <https://github.com/apollographql/graphql-tools>`__.
|
@ -20,11 +20,8 @@ See:
|
||||
Basics <basics>
|
||||
Relationships <relationships/index>
|
||||
Customise with views <views>
|
||||
Customise with schema stitching <schema-stitching>
|
||||
Adding custom resolvers <custom-resolvers>
|
||||
Default field values <default-values/index>
|
||||
Enum type fields <enums>
|
||||
Using an existing database <using-existing-database>
|
||||
Export GraphQL schema <export-graphql-schema>
|
||||
How schema generation works <how-it-works>
|
||||
|
||||
|
@ -1,132 +0,0 @@
|
||||
========================
|
||||
GraphQL Schema Stitching
|
||||
========================
|
||||
|
||||
Schema stitching is the process of creating a single GraphQL schema from multiple underlying GraphQL APIs.
|
||||
|
||||
If you need to add custom business logic or customize your GraphQL schema then we recommend using schema stitching.
|
||||
|
||||
Here are 2 common use cases:
|
||||
|
||||
- Fetch data from sources that are not in the database (eg: a weather API)
|
||||
- Customize mutations (eg: running validations before inserts)
|
||||
|
||||
.. note::
|
||||
|
||||
If you are looking for ``graphql-bindings``, please check out `this git repository
|
||||
<https://github.com/hasura/generate-graphql-bindings>`_.
|
||||
|
||||
Schema stitching allows you to have one unified API that allows the client to query multiple GraphQL Schemas at the
|
||||
same time, including relations between the schemas.
|
||||
|
||||
.. image:: ../../../img/graphql/manual/schema/graphql-schema-stitching.png
|
||||
:scale: 50%
|
||||
|
||||
In the above architecture, we see that there are multiple decoupled graphql services running somewhere and a central
|
||||
server acts as a GraphQL Proxy server and it combines the different schemas into a unified API that the client can
|
||||
query on.
|
||||
|
||||
Let's go through a simple example to understand why schema stitching might be required in certain use cases and how
|
||||
it can be leveraged to unify all your APIs to a single GraphQL API.
|
||||
|
||||
Assume the following database schema in PostgreSQL:
|
||||
|
||||
+----------------------------------------+----------------------------------------+
|
||||
|Table |Columns |
|
||||
+========================================+========================================+
|
||||
|person |id, name, city |
|
||||
+----------------------------------------+----------------------------------------+
|
||||
|
||||
We have a simple ``person`` table with columns ``id``, ``name`` and ``city``. For this example, the above table has
|
||||
anonymous select permission.
|
||||
|
||||
The GraphQL query in Hasura Data API for the above table would look like:
|
||||
|
||||
.. code-block:: graphql
|
||||
|
||||
query fetch_person {
|
||||
person {
|
||||
id
|
||||
name
|
||||
city
|
||||
}
|
||||
}
|
||||
|
||||
This is a simple select on table person.
|
||||
|
||||
On the other hand, we have a GraphQL server for fetching weather information that connects to ``Meta Weather API``.
|
||||
|
||||
The GraphQL schema for this weather API looks like:
|
||||
|
||||
.. code-block:: graphql
|
||||
|
||||
type CityWeather {
|
||||
temp: String
|
||||
min_temp: String
|
||||
max_temp: String
|
||||
city_name: String!
|
||||
applicable_date: String!
|
||||
}
|
||||
|
||||
.. code-block:: graphql
|
||||
|
||||
type Query {
|
||||
cityWeather(city_name: String! applicable_date: String): CityWeather
|
||||
}
|
||||
|
||||
The GraphQL query to fetch this weather information would look like:
|
||||
|
||||
.. code-block:: graphql
|
||||
|
||||
query {
|
||||
cityWeather (city_name: "Bangalore") {
|
||||
city_name
|
||||
temp
|
||||
min_temp
|
||||
max_temp
|
||||
applicable_date
|
||||
}
|
||||
}
|
||||
|
||||
Explore this API on `Apollo LaunchPad <https://launchpad.graphql.com/nxw8w0z9q7>`_.
|
||||
|
||||
Note the usage of ``city_name`` as an argument for the ``cityWeather`` query. Using this we can extend our original
|
||||
Postgres's ``person`` schema to include weather information based on the ``city`` column of the person table.
|
||||
|
||||
.. code-block:: graphql
|
||||
|
||||
extend type person {
|
||||
city_weather: CityWeather,
|
||||
}
|
||||
|
||||
We have extended the type person to have one more field called ``city_weather``. This will resolve to the weather
|
||||
schema defined above and the respective resolver will return appropriate data.
|
||||
|
||||
The source code for the custom resolver can be found on GitHub - `graphql-schema-stitching-demo
|
||||
<https://github.com/hasura/graphql-schema-stitching-demo>`_. Note the usage of ``mergeSchemas``, a
|
||||
``graphql-tools`` utility that enables schema stitching.
|
||||
|
||||
Now the merged schema can be queried as:
|
||||
|
||||
.. code-block:: graphql
|
||||
|
||||
query {
|
||||
person {
|
||||
id
|
||||
name
|
||||
city
|
||||
city_weather {
|
||||
city_name
|
||||
temp
|
||||
min_temp
|
||||
max_temp
|
||||
applicable_date
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
This is a neat abstraction for the client making the GraphQL API, as all the merging of different schemas are
|
||||
done by the server and exposed as a single API.
|
||||
|
||||
Read the official docs on `schema stitching <https://www.apollographql.com/docs/graphql-tools/schema-stitching.html>`_
|
||||
by Apollo for detailed guides.
|
After Width: | Height: | Size: 28 KiB |
BIN
docs/img/graphql/manual/business-logic/custom-business-logic.png
Normal file
After Width: | Height: | Size: 19 KiB |
Before Width: | Height: | Size: 23 KiB After Width: | Height: | Size: 23 KiB |
After Width: | Height: | Size: 19 KiB |
BIN
docs/img/graphql/manual/remote-schemas/remote-schemas-arch.png
Normal file
After Width: | Height: | Size: 28 KiB |
After Width: | Height: | Size: 20 KiB |
80
remote-schemas.md
Normal file
@ -0,0 +1,80 @@
|
||||
# Remote schemas
|
||||
|
||||
Merge remote GraphQL schemas with GraphQL Engine's Postgres-based schema to query all your GraphQL types from the same endpoint. Remote schemas are ideal for use cases such as:
|
||||
|
||||
* Customizing mutations (*e.g. running validations before inserts*)
|
||||
* Supporting features like payments, etc. and providing a consistent interface to access them i.e. behind the GraphQL Engine's API
|
||||
* Fetching disparate data from other sources (*e.g. from a weather API or another database*)
|
||||
|
||||
To support custom business logic, you'll need to create a custom GraphQL server (see [boilerplates](community/boilerplates/graphql-servers)) and merge its schema with GraphQL Engine's.
|
||||
|
||||
![remote schemas architecture](assets/remote-schemas-arch.png)
|
||||
|
||||
## Demo (*40 seconds*)
|
||||
|
||||
[![video demo of merging remote schemas](https://img.youtube.com/vi/eY4n9aPsi0M/0.jpg)](https://www.youtube.com/watch?v=eY4n9aPsi0M)
|
||||
|
||||
[Merge remote GraphQL schemas (YouTube link)](https://youtu.be/eY4n9aPsi0M)
|
||||
|
||||
## Quickstart
|
||||
|
||||
The fastest way to try remote schemas out is via Heroku.
|
||||
|
||||
1. Click on the following button to deploy GraphQL Engine on Heroku with the free Postgres add-on:
|
||||
|
||||
[![Deploy to Heroku](https://www.herokucdn.com/deploy/button.svg)](https://heroku.com/deploy?template=https://github.com/hasura/graphql-engine-heroku)
|
||||
|
||||
2. Open the Hasura console
|
||||
|
||||
Visit `https://<app-name>.herokuapp.com` (*replace \<app-name\> with your app name*) to open the admin console.
|
||||
|
||||
3. Merge your first remote schema and query it
|
||||
|
||||
In the admin console, open the ``Remote Schemas`` tab and click on the ``Add`` button. Fill in the following details:
|
||||
* Remote Schema name: ``countries`` (*an alias for this remote schema*).
|
||||
* GraphQL server URL: ``https://countries.trevorblades.com/`` (*a public GraphQL API that we'll use to quickly check out this feature; maintained by [@trevorblades](https://github.com/trevorblades)*).
|
||||
* Ignore the remaining configuration settings and click on the ``Add Remote Schema`` button.
|
||||
|
||||
Head to the ``GraphiQL`` tab and run the following query (*paste it in the query window on the left and click the* ▶️ *(play) button*):
|
||||
|
||||
```graphql
|
||||
{
|
||||
countries {
|
||||
emoji
|
||||
name
|
||||
languages {
|
||||
name
|
||||
native
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
You can explore the GraphQL types from the remote schema using the ``Docs`` explorer in the top right corner of the ``GraphiQL`` interface.
|
||||
|
||||
## Boilerplates
|
||||
|
||||
Boilerplates for custom GraphQL servers in popular languages/frameworks are available.
|
||||
|
||||
* [Regular boilerplates](community/boilerplates/graphql-servers) that can be deployed anywhere.
|
||||
* [Serverless boilerplates](https://github.com/hasura/graphql-serverless) that can be deployed on serverless platforms like AWS Lambda, etc.
|
||||
|
||||
Please note that boilerplates for more languages, frameworks, serverless platforms, etc. are being iterated upon and community contributions are very welcome.
|
||||
|
||||
|
||||
## Caveats
|
||||
|
||||
**Current limitations**:
|
||||
|
||||
* Nomenclature: Type names and node names need to be unique across all merged schemas (case-sensitive match). In the next few iterations, support for merging types with the exact same name and structure will be available.
|
||||
* Nodes from different GraphQL servers cannot be used in the same query/mutation. All top-level nodes have to be from the same GraphQL server.
|
||||
* Subscriptions on remote GraphQL servers are not supported.
|
||||
* Interfaces are not supported - if a remote schema has interfaces, an error will be thrown if you try to merge it.
|
||||
|
||||
These limitations will be addressed in upcoming versions.
|
||||
|
||||
## Documentation
|
||||
|
||||
Read the complete [documentation](https://docs.hasura.io/1.0/graphql/manual/remote-schemas/index.html).
|
||||
|
||||
|
2
server/.gitignore
vendored
@ -1,6 +1,8 @@
|
||||
__pycache__/
|
||||
dist
|
||||
cabal-dev
|
||||
Pipfile
|
||||
Pipfile.lock
|
||||
*.o
|
||||
*.hi
|
||||
*.chi
|
||||
|
@ -10,6 +10,10 @@ build_dir := $(project_dir)/$(shell stack path --dist-dir)/build
|
||||
|
||||
build_output := /build/_server_output
|
||||
|
||||
dev: src-lib src-exec src-rsr
|
||||
stack build --fast
|
||||
stack exec graphql-engine -- --database-url postgres://postgres:@localhost:5432/hge serve --enable-console
|
||||
|
||||
image: $(project).cabal
|
||||
docker build -t "$(registry)/$(project):$(version)" \
|
||||
-f packaging/Dockerfile \
|
||||
|
@ -144,13 +144,15 @@ library
|
||||
, Hasura.Server.CheckUpdates
|
||||
, Hasura.RQL.Types
|
||||
, Hasura.RQL.Instances
|
||||
, Hasura.RQL.Types.SchemaCache
|
||||
, Hasura.RQL.Types.SchemaCacheTypes
|
||||
, Hasura.RQL.Types.Common
|
||||
, Hasura.RQL.Types.BoolExp
|
||||
, Hasura.RQL.Types.SchemaCache
|
||||
, Hasura.RQL.Types.Permission
|
||||
, Hasura.RQL.Types.Error
|
||||
, Hasura.RQL.Types.DML
|
||||
, Hasura.RQL.Types.Subscribe
|
||||
, Hasura.RQL.Types.RemoteSchema
|
||||
, Hasura.RQL.DDL.Deps
|
||||
, Hasura.RQL.DDL.Permission.Internal
|
||||
, Hasura.RQL.DDL.Permission.Triggers
|
||||
@ -162,6 +164,8 @@ library
|
||||
, Hasura.RQL.DDL.Metadata
|
||||
, Hasura.RQL.DDL.Utils
|
||||
, Hasura.RQL.DDL.Subscribe
|
||||
, Hasura.RQL.DDL.Headers
|
||||
, Hasura.RQL.DDL.RemoteSchema
|
||||
, Hasura.RQL.DML.Delete
|
||||
, Hasura.RQL.DML.Internal
|
||||
, Hasura.RQL.DML.Insert
|
||||
@ -195,10 +199,15 @@ library
|
||||
, Hasura.GraphQL.Resolve.Insert
|
||||
, Hasura.GraphQL.Resolve.Mutation
|
||||
, Hasura.GraphQL.Resolve.Select
|
||||
, Hasura.GraphQL.RemoteServer
|
||||
, Hasura.GraphQL.Context
|
||||
, Hasura.GraphQL.Resolve.ContextTypes
|
||||
|
||||
, Hasura.Events.Lib
|
||||
, Hasura.Events.HTTP
|
||||
|
||||
, Hasura.HTTP.Utils
|
||||
|
||||
, Data.Text.Extended
|
||||
, Data.Sequence.NonEmpty
|
||||
, Data.TByteString
|
||||
@ -212,6 +221,7 @@ library
|
||||
, Hasura.SQL.Rewrite
|
||||
, Hasura.Prelude
|
||||
, Hasura.Logging
|
||||
, Network.URI.Extended
|
||||
, Ops
|
||||
, TH
|
||||
|
||||
|
@ -156,9 +156,9 @@ main = do
|
||||
-- enable console config
|
||||
finalEnableConsole <- bool getEnableConsoleEnv (return True) enableConsole
|
||||
-- init catalog if necessary
|
||||
initialise ci
|
||||
initialise ci httpManager
|
||||
-- migrate catalog if necessary
|
||||
migrate ci
|
||||
migrate ci httpManager
|
||||
prepareEvents ci
|
||||
pool <- Q.initPGPool ci cp
|
||||
putStrLn $ "server: running on port " ++ show port
|
||||
@ -189,7 +189,7 @@ main = do
|
||||
either ((>> exitFailure) . printJSON) (const cleanSuccess) res
|
||||
ROExecute -> do
|
||||
queryBs <- BL.getContents
|
||||
res <- runTx ci $ execQuery queryBs
|
||||
res <- runTx ci $ execQuery httpManager queryBs
|
||||
either ((>> exitFailure) . printJSON) BLC.putStrLn res
|
||||
where
|
||||
runTx ci tx = do
|
||||
@ -198,13 +198,13 @@ main = do
|
||||
getMinimalPool ci = do
|
||||
let connParams = Q.defaultConnParams { Q.cpConns = 1 }
|
||||
Q.initPGPool ci connParams
|
||||
initialise ci = do
|
||||
initialise ci httpMgr = do
|
||||
currentTime <- getCurrentTime
|
||||
res <- runTx ci $ initCatalogSafe currentTime
|
||||
res <- runTx ci $ initCatalogSafe currentTime httpMgr
|
||||
either ((>> exitFailure) . printJSON) putStrLn res
|
||||
migrate ci = do
|
||||
migrate ci httpMgr = do
|
||||
currentTime <- getCurrentTime
|
||||
res <- runTx ci $ migrateCatalog currentTime
|
||||
res <- runTx ci $ migrateCatalog httpMgr currentTime
|
||||
either ((>> exitFailure) . printJSON) putStrLn res
|
||||
prepareEvents ci = do
|
||||
putStrLn "event_triggers: preparing data"
|
||||
|
@ -27,21 +27,22 @@ import qualified Data.Text as T
|
||||
|
||||
import qualified Database.PG.Query as Q
|
||||
import qualified Database.PG.Query.Connection as Q
|
||||
import qualified Network.HTTP.Client as HTTP
|
||||
|
||||
curCatalogVer :: T.Text
|
||||
curCatalogVer = "4"
|
||||
curCatalogVer = "5"
|
||||
|
||||
initCatalogSafe :: UTCTime -> Q.TxE QErr String
|
||||
initCatalogSafe initTime = do
|
||||
initCatalogSafe :: UTCTime -> HTTP.Manager -> Q.TxE QErr String
|
||||
initCatalogSafe initTime httpMgr = do
|
||||
hdbCatalogExists <- Q.catchE defaultTxErrorHandler $
|
||||
doesSchemaExist $ SchemaName "hdb_catalog"
|
||||
bool (initCatalogStrict True initTime) onCatalogExists hdbCatalogExists
|
||||
bool (initCatalogStrict True initTime httpMgr) onCatalogExists hdbCatalogExists
|
||||
where
|
||||
onCatalogExists = do
|
||||
versionExists <- Q.catchE defaultTxErrorHandler $
|
||||
doesVersionTblExist
|
||||
(SchemaName "hdb_catalog") (TableName "hdb_version")
|
||||
bool (initCatalogStrict False initTime) (return initialisedMsg) versionExists
|
||||
bool (initCatalogStrict False initTime httpMgr) (return initialisedMsg) versionExists
|
||||
|
||||
initialisedMsg = "initialise: the state is already initialised"
|
||||
|
||||
@ -62,8 +63,8 @@ initCatalogSafe initTime = do
|
||||
)
|
||||
|] (Identity sn) False
|
||||
|
||||
initCatalogStrict :: Bool -> UTCTime -> Q.TxE QErr String
|
||||
initCatalogStrict createSchema initTime = do
|
||||
initCatalogStrict :: Bool -> UTCTime -> HTTP.Manager -> Q.TxE QErr String
|
||||
initCatalogStrict createSchema initTime httpMgr = do
|
||||
Q.catchE defaultTxErrorHandler $ do
|
||||
|
||||
when createSchema $ do
|
||||
@ -89,7 +90,7 @@ initCatalogStrict createSchema initTime = do
|
||||
return ()
|
||||
|
||||
-- Build the metadata query
|
||||
tx <- liftEither $ buildTxAny adminUserInfo emptySchemaCache metadataQuery
|
||||
tx <- liftEither $ buildTxAny adminUserInfo emptySchemaCache httpMgr metadataQuery
|
||||
|
||||
-- Execute the query
|
||||
void $ snd <$> tx
|
||||
@ -169,14 +170,14 @@ from08To1 = Q.catchE defaultTxErrorHandler $ do
|
||||
json_build_object('type', 'select', 'args', template_defn->'select');
|
||||
|] () False
|
||||
|
||||
from1To2 :: Q.TxE QErr ()
|
||||
from1To2 = do
|
||||
from1To2 :: HTTP.Manager -> Q.TxE QErr ()
|
||||
from1To2 httpMgr = do
|
||||
-- migrate database
|
||||
Q.Discard () <- Q.multiQE defaultTxErrorHandler
|
||||
$(Q.sqlFromFile "src-rsr/migrate_from_1.sql")
|
||||
-- migrate metadata
|
||||
tx <- liftEither $ buildTxAny adminUserInfo
|
||||
emptySchemaCache migrateMetadataFrom1
|
||||
emptySchemaCache httpMgr migrateMetadataFrom1
|
||||
void tx
|
||||
-- set as system defined
|
||||
setAsSystemDefined
|
||||
@ -188,6 +189,19 @@ from2To3 = Q.catchE defaultTxErrorHandler $ do
|
||||
Q.unitQ "CREATE INDEX ON hdb_catalog.event_log (trigger_id)" () False
|
||||
Q.unitQ "CREATE INDEX ON hdb_catalog.event_invocation_logs (event_id)" () False
|
||||
|
||||
-- custom resolver
|
||||
from4To5 :: HTTP.Manager -> Q.TxE QErr ()
|
||||
from4To5 httpMgr = do
|
||||
Q.Discard () <- Q.multiQE defaultTxErrorHandler
|
||||
$(Q.sqlFromFile "src-rsr/migrate_from_4_to_5.sql")
|
||||
-- migrate metadata
|
||||
tx <- liftEither $ buildTxAny adminUserInfo
|
||||
emptySchemaCache httpMgr migrateMetadataFrom4
|
||||
void tx
|
||||
-- set as system defined
|
||||
setAsSystemDefined
|
||||
|
||||
|
||||
from3To4 :: Q.TxE QErr ()
|
||||
from3To4 = Q.catchE defaultTxErrorHandler $ do
|
||||
Q.unitQ "ALTER TABLE hdb_catalog.event_triggers ADD COLUMN configuration JSON" () False
|
||||
@ -204,7 +218,8 @@ from3To4 = Q.catchE defaultTxErrorHandler $ do
|
||||
\, DROP COLUMN retry_interval\
|
||||
\, DROP COLUMN headers" () False
|
||||
where
|
||||
uncurryEventTrigger (trn, Q.AltJ tDef, w, nr, rint, Q.AltJ headers) = EventTriggerConf trn tDef (Just w) Nothing (RetryConf nr rint) headers
|
||||
uncurryEventTrigger (trn, Q.AltJ tDef, w, nr, rint, Q.AltJ headers) =
|
||||
EventTriggerConf trn tDef (Just w) Nothing (RetryConf nr rint) headers
|
||||
updateEventTrigger3To4 etc@(EventTriggerConf name _ _ _ _ _) = Q.unitQ [Q.sql|
|
||||
UPDATE hdb_catalog.event_triggers
|
||||
SET
|
||||
@ -212,8 +227,8 @@ from3To4 = Q.catchE defaultTxErrorHandler $ do
|
||||
WHERE name = $2
|
||||
|] (Q.AltJ $ A.toJSON etc, name) True
|
||||
|
||||
migrateCatalog :: UTCTime -> Q.TxE QErr String
|
||||
migrateCatalog migrationTime = do
|
||||
migrateCatalog :: HTTP.Manager -> UTCTime -> Q.TxE QErr String
|
||||
migrateCatalog httpMgr migrationTime = do
|
||||
preVer <- getCatalogVersion
|
||||
if | preVer == curCatalogVer ->
|
||||
return "migrate: already at the latest version"
|
||||
@ -221,19 +236,24 @@ migrateCatalog migrationTime = do
|
||||
| preVer == "1" -> from1ToCurrent
|
||||
| preVer == "2" -> from2ToCurrent
|
||||
| preVer == "3" -> from3ToCurrent
|
||||
| preVer == "4" -> from4ToCurrent
|
||||
| otherwise -> throw400 NotSupported $
|
||||
"migrate: unsupported version : " <> preVer
|
||||
where
|
||||
from4ToCurrent = do
|
||||
from4To5 httpMgr
|
||||
postMigrate
|
||||
|
||||
from3ToCurrent = do
|
||||
from3To4
|
||||
postMigrate
|
||||
from4ToCurrent
|
||||
|
||||
from2ToCurrent = do
|
||||
from2To3
|
||||
from3ToCurrent
|
||||
|
||||
from1ToCurrent = do
|
||||
from1To2
|
||||
from1To2 httpMgr
|
||||
from2ToCurrent
|
||||
|
||||
from08ToCurrent = do
|
||||
@ -246,7 +266,7 @@ migrateCatalog migrationTime = do
|
||||
-- clean hdb_views
|
||||
Q.catchE defaultTxErrorHandler clearHdbViews
|
||||
-- try building the schema cache
|
||||
void buildSchemaCache
|
||||
void $ buildSchemaCache httpMgr
|
||||
return $ "migrate: successfully migrated to " ++ show curCatalogVer
|
||||
|
||||
updateVersion =
|
||||
@ -256,13 +276,14 @@ migrateCatalog migrationTime = do
|
||||
"upgraded_on" = $2
|
||||
|] (curCatalogVer, migrationTime) False
|
||||
|
||||
execQuery :: BL.ByteString -> Q.TxE QErr BL.ByteString
|
||||
execQuery queryBs = do
|
||||
execQuery :: HTTP.Manager -> BL.ByteString -> Q.TxE QErr BL.ByteString
|
||||
execQuery httpMgr queryBs = do
|
||||
query <- case A.decode queryBs of
|
||||
Just jVal -> decodeValue jVal
|
||||
Nothing -> throw400 InvalidJSON "invalid json"
|
||||
schemaCache <- buildSchemaCache
|
||||
tx <- liftEither $ buildTxAny adminUserInfo schemaCache query
|
||||
schemaCache <- buildSchemaCache httpMgr
|
||||
tx <- liftEither $ buildTxAny adminUserInfo schemaCache
|
||||
httpMgr query
|
||||
fst <$> tx
|
||||
|
||||
|
||||
|
@ -6,6 +6,7 @@
|
||||
module TH
|
||||
( metadataQuery
|
||||
, migrateMetadataFrom1
|
||||
, migrateMetadataFrom4
|
||||
) where
|
||||
|
||||
import Language.Haskell.TH.Syntax (Q, TExp, unTypeQ)
|
||||
@ -20,3 +21,5 @@ metadataQuery = $(unTypeQ (Y.decodeFile "src-rsr/hdb_metadata.yaml" :: Q (TExp R
|
||||
migrateMetadataFrom1 :: RQLQuery
|
||||
migrateMetadataFrom1 = $(unTypeQ (Y.decodeFile "src-rsr/migrate_metadata_from_1.yaml" :: Q (TExp RQLQuery)))
|
||||
|
||||
migrateMetadataFrom4 :: RQLQuery
|
||||
migrateMetadataFrom4 = $(unTypeQ (Y.decodeFile "src-rsr/migrate_metadata_from_4_to_5.yaml" :: Q (TExp RQLQuery)))
|
||||
|
@ -48,6 +48,7 @@ import Data.Has
|
||||
import Hasura.Logging
|
||||
-- import Data.Monoid
|
||||
import Hasura.Prelude
|
||||
import Hasura.RQL.DDL.Headers
|
||||
import Hasura.RQL.Types.Subscribe
|
||||
|
||||
-- import Context (HTTPSessionMgr (..))
|
||||
|
@ -26,6 +26,7 @@ import Data.IORef (IORef, readIORef)
|
||||
import Data.Time.Clock
|
||||
import Hasura.Events.HTTP
|
||||
import Hasura.Prelude
|
||||
import Hasura.RQL.DDL.Headers
|
||||
import Hasura.RQL.Types
|
||||
import Hasura.SQL.Types
|
||||
|
||||
@ -41,7 +42,6 @@ import qualified Data.Text.Encoding as TE
|
||||
import qualified Data.Text.Encoding.Error as TE
|
||||
import qualified Data.Time.Clock as Time
|
||||
import qualified Database.PG.Query as Q
|
||||
import qualified Hasura.GraphQL.Schema as GS
|
||||
import qualified Hasura.Logging as L
|
||||
import qualified Network.HTTP.Types as N
|
||||
import qualified Network.Wreq as W
|
||||
@ -54,7 +54,7 @@ invocationVersion = "2"
|
||||
|
||||
type LogEnvHeaders = Bool
|
||||
|
||||
type CacheRef = IORef (SchemaCache, GS.GCtxMap)
|
||||
type CacheRef = IORef SchemaCache
|
||||
|
||||
newtype EventInternalErr
|
||||
= EventInternalErr QErr
|
||||
@ -232,7 +232,7 @@ processEvent logenv pool e = do
|
||||
checkError err = do
|
||||
let mretryHeader = getRetryAfterHeaderFromError err
|
||||
cacheRef::CacheRef <- asks getter
|
||||
(cache, _) <- liftIO $ readIORef cacheRef
|
||||
cache <- liftIO $ readIORef cacheRef
|
||||
let eti = getEventTriggerInfoFromEvent cache e
|
||||
retryConfM = etiRetryConf <$> eti
|
||||
retryConf = fromMaybe (RetryConf 0 10) retryConfM
|
||||
@ -274,7 +274,7 @@ tryWebhook
|
||||
tryWebhook logenv pool e = do
|
||||
logger:: HLogger <- asks getter
|
||||
cacheRef::CacheRef <- asks getter
|
||||
(cache, _) <- liftIO $ readIORef cacheRef
|
||||
cache <- liftIO $ readIORef cacheRef
|
||||
let meti = getEventTriggerInfoFromEvent cache e
|
||||
case meti of
|
||||
Nothing -> return $ Left $ HOther "table or event-trigger not found"
|
||||
|
310
server/src-lib/Hasura/GraphQL/Context.hs
Normal file
@ -0,0 +1,310 @@
|
||||
{-# LANGUAGE FlexibleContexts #-}
|
||||
{-# LANGUAGE FlexibleInstances #-}
|
||||
{-# LANGUAGE MultiParamTypeClasses #-}
|
||||
{-# LANGUAGE MultiWayIf #-}
|
||||
{-# LANGUAGE NoImplicitPrelude #-}
|
||||
{-# LANGUAGE OverloadedStrings #-}
|
||||
{-# LANGUAGE TemplateHaskell #-}
|
||||
|
||||
module Hasura.GraphQL.Context where
|
||||
|
||||
import Data.Aeson
|
||||
import Data.Has
|
||||
import Hasura.Prelude
|
||||
|
||||
import qualified Data.HashMap.Strict as Map
|
||||
import qualified Data.HashSet as Set
|
||||
import qualified Data.Text as T
|
||||
import qualified Language.GraphQL.Draft.Syntax as G
|
||||
|
||||
import Hasura.GraphQL.Resolve.ContextTypes
|
||||
import Hasura.GraphQL.Validate.Types
|
||||
import Hasura.RQL.Types.BoolExp
|
||||
import Hasura.RQL.Types.Common
|
||||
import Hasura.RQL.Types.Permission
|
||||
import Hasura.SQL.Types
|
||||
|
||||
|
||||
type OpCtxMap = Map.HashMap G.Name OpCtx
|
||||
|
||||
data OpCtx
|
||||
-- table, req hdrs
|
||||
= OCInsert QualifiedTable [T.Text]
|
||||
-- tn, filter exp, limit, req hdrs
|
||||
| OCSelect QualifiedTable AnnBoolExpSQL (Maybe Int) [T.Text]
|
||||
-- tn, filter exp, reqt hdrs
|
||||
| OCSelectPkey QualifiedTable AnnBoolExpSQL [T.Text]
|
||||
-- tn, filter exp, limit, req hdrs
|
||||
| OCSelectAgg QualifiedTable AnnBoolExpSQL (Maybe Int) [T.Text]
|
||||
-- tn, filter exp, req hdrs
|
||||
| OCUpdate QualifiedTable AnnBoolExpSQL [T.Text]
|
||||
-- tn, filter exp, req hdrs
|
||||
| OCDelete QualifiedTable AnnBoolExpSQL [T.Text]
|
||||
deriving (Show, Eq)
|
||||
|
||||
data GCtx
|
||||
= GCtx
|
||||
{ _gTypes :: !TypeMap
|
||||
, _gFields :: !FieldMap
|
||||
, _gOrdByCtx :: !OrdByCtx
|
||||
, _gQueryRoot :: !ObjTyInfo
|
||||
, _gMutRoot :: !(Maybe ObjTyInfo)
|
||||
, _gSubRoot :: !(Maybe ObjTyInfo)
|
||||
, _gOpCtxMap :: !OpCtxMap
|
||||
, _gInsCtxMap :: !InsCtxMap
|
||||
} deriving (Show, Eq)
|
||||
|
||||
instance Has TypeMap GCtx where
|
||||
getter = _gTypes
|
||||
modifier f ctx = ctx { _gTypes = f $ _gTypes ctx }
|
||||
|
||||
-- data OpCtx
|
||||
-- -- table, req hdrs
|
||||
-- = OCInsert QualifiedTable [T.Text]
|
||||
-- -- tn, filter exp, limit, req hdrs
|
||||
-- | OCSelect QualifiedTable S.BoolExp (Maybe Int) [T.Text]
|
||||
-- -- tn, filter exp, reqt hdrs
|
||||
-- | OCSelectPkey QualifiedTable S.BoolExp [T.Text]
|
||||
-- -- tn, filter exp, limit, req hdrs
|
||||
-- | OCSelectAgg QualifiedTable S.BoolExp (Maybe Int) [T.Text]
|
||||
-- -- tn, filter exp, req hdrs
|
||||
-- | OCUpdate QualifiedTable S.BoolExp [T.Text]
|
||||
-- -- tn, filter exp, req hdrs
|
||||
-- | OCDelete QualifiedTable S.BoolExp [T.Text]
|
||||
-- deriving (Show, Eq)
|
||||
|
||||
-- data GCtx
|
||||
-- = GCtx
|
||||
-- { _gTypes :: !TypeMap
|
||||
-- , _gFields :: !FieldMap
|
||||
-- , _gOrdByCtx :: !OrdByCtx
|
||||
-- , _gQueryRoot :: !ObjTyInfo
|
||||
-- , _gMutRoot :: !(Maybe ObjTyInfo)
|
||||
-- , _gSubRoot :: !(Maybe ObjTyInfo)
|
||||
-- , _gOpCtxMap :: !OpCtxMap
|
||||
-- , _gInsCtxMap :: !InsCtxMap
|
||||
-- } deriving (Show, Eq)
|
||||
|
||||
-- instance Has TypeMap GCtx where
|
||||
-- getter = _gTypes
|
||||
-- modifier f ctx = ctx { _gTypes = f $ _gTypes ctx }
|
||||
|
||||
instance ToJSON GCtx where
|
||||
toJSON _ = String "GCtx"
|
||||
|
||||
type GCtxMap = Map.HashMap RoleName GCtx
|
||||
|
||||
data TyAgg
|
||||
= TyAgg
|
||||
{ _taTypes :: !TypeMap
|
||||
, _taFields :: !FieldMap
|
||||
, _taOrdBy :: !OrdByCtx
|
||||
} deriving (Show, Eq)
|
||||
|
||||
instance Semigroup TyAgg where
|
||||
(TyAgg t1 f1 o1) <> (TyAgg t2 f2 o2) =
|
||||
TyAgg (Map.union t1 t2) (Map.union f1 f2) (Map.union o1 o2)
|
||||
|
||||
instance Monoid TyAgg where
|
||||
mempty = TyAgg Map.empty Map.empty Map.empty
|
||||
mappend = (<>)
|
||||
|
||||
newtype RootFlds
|
||||
= RootFlds
|
||||
{ _taMutation :: Map.HashMap G.Name (OpCtx, Either ObjFldInfo ObjFldInfo)
|
||||
} deriving (Show, Eq)
|
||||
|
||||
instance Semigroup RootFlds where
|
||||
(RootFlds m1) <> (RootFlds m2)
|
||||
= RootFlds (Map.union m1 m2)
|
||||
|
||||
instance Monoid RootFlds where
|
||||
mempty = RootFlds Map.empty
|
||||
mappend = (<>)
|
||||
|
||||
mkHsraObjFldInfo
|
||||
:: Maybe G.Description
|
||||
-> G.Name
|
||||
-> ParamMap
|
||||
-> G.GType
|
||||
-> ObjFldInfo
|
||||
mkHsraObjFldInfo descM name params ty =
|
||||
ObjFldInfo descM name params ty HasuraType
|
||||
|
||||
mkHsraObjTyInfo
|
||||
:: Maybe G.Description
|
||||
-> G.NamedType
|
||||
-> ObjFieldMap
|
||||
-> ObjTyInfo
|
||||
mkHsraObjTyInfo descM ty flds =
|
||||
mkObjTyInfo descM ty flds HasuraType
|
||||
|
||||
mkHsraInpTyInfo
|
||||
:: Maybe G.Description
|
||||
-> G.NamedType
|
||||
-> InpObjFldMap
|
||||
-> InpObjTyInfo
|
||||
mkHsraInpTyInfo descM ty flds =
|
||||
InpObjTyInfo descM ty flds HasuraType
|
||||
|
||||
mkHsraEnumTyInfo
|
||||
:: Maybe G.Description
|
||||
-> G.NamedType
|
||||
-> Map.HashMap G.EnumValue EnumValInfo
|
||||
-> EnumTyInfo
|
||||
mkHsraEnumTyInfo descM ty enumVals =
|
||||
EnumTyInfo descM ty enumVals HasuraType
|
||||
|
||||
mkHsraScalarTyInfo :: PGColType -> ScalarTyInfo
|
||||
mkHsraScalarTyInfo ty = ScalarTyInfo Nothing ty HasuraType
|
||||
|
||||
fromInpValL :: [InpValInfo] -> Map.HashMap G.Name InpValInfo
|
||||
fromInpValL = mapFromL _iviName
|
||||
|
||||
mkCompExpName :: PGColType -> G.Name
|
||||
mkCompExpName pgColTy =
|
||||
G.Name $ T.pack (show pgColTy) <> "_comparison_exp"
|
||||
|
||||
mkCompExpTy :: PGColType -> G.NamedType
|
||||
mkCompExpTy =
|
||||
G.NamedType . mkCompExpName
|
||||
|
||||
--- | make compare expression input type
|
||||
mkCompExpInp :: PGColType -> InpObjTyInfo
|
||||
mkCompExpInp colTy =
|
||||
InpObjTyInfo (Just tyDesc) (mkCompExpTy colTy) (fromInpValL $ concat
|
||||
[ map (mk colScalarTy) typedOps
|
||||
, map (mk $ G.toLT colScalarTy) listOps
|
||||
, bool [] (map (mk $ mkScalarTy PGText) stringOps) isStringTy
|
||||
, bool [] (map jsonbOpToInpVal jsonbOps) isJsonbTy
|
||||
, [InpValInfo Nothing "_is_null" $ G.TypeNamed (G.Nullability True) $ G.NamedType "Boolean"]
|
||||
]) HasuraType
|
||||
where
|
||||
tyDesc = mconcat
|
||||
[ "expression to compare columns of type "
|
||||
, G.Description (T.pack $ show colTy)
|
||||
, ". All fields are combined with logical 'AND'."
|
||||
]
|
||||
isStringTy = case colTy of
|
||||
PGVarchar -> True
|
||||
PGText -> True
|
||||
_ -> False
|
||||
mk t n = InpValInfo Nothing n $ G.toGT t
|
||||
colScalarTy = mkScalarTy colTy
|
||||
-- colScalarListTy = GA.GTList colGTy
|
||||
typedOps =
|
||||
["_eq", "_neq", "_gt", "_lt", "_gte", "_lte"]
|
||||
listOps =
|
||||
[ "_in", "_nin" ]
|
||||
-- TODO
|
||||
-- columnOps =
|
||||
-- [ "_ceq", "_cneq", "_cgt", "_clt", "_cgte", "_clte"]
|
||||
stringOps =
|
||||
[ "_like", "_nlike", "_ilike", "_nilike"
|
||||
, "_similar", "_nsimilar"
|
||||
]
|
||||
isJsonbTy = case colTy of
|
||||
PGJSONB -> True
|
||||
_ -> False
|
||||
jsonbOpToInpVal (op, ty, desc) = InpValInfo (Just desc) op ty
|
||||
jsonbOps =
|
||||
[ ( "_contains"
|
||||
, G.toGT $ mkScalarTy PGJSONB
|
||||
, "does the column contain the given json value at the top level"
|
||||
)
|
||||
, ( "_contained_in"
|
||||
, G.toGT $ mkScalarTy PGJSONB
|
||||
, "is the column contained in the given json value"
|
||||
)
|
||||
, ( "_has_key"
|
||||
, G.toGT $ mkScalarTy PGText
|
||||
, "does the string exist as a top-level key in the column"
|
||||
)
|
||||
, ( "_has_keys_any"
|
||||
, G.toGT $ G.toLT $ G.toNT $ mkScalarTy PGText
|
||||
, "do any of these strings exist as top-level keys in the column"
|
||||
)
|
||||
, ( "_has_keys_all"
|
||||
, G.toGT $ G.toLT $ G.toNT $ mkScalarTy PGText
|
||||
, "do all of these strings exist as top-level keys in the column"
|
||||
)
|
||||
]
|
||||
|
||||
ordByTy :: G.NamedType
|
||||
ordByTy = G.NamedType "order_by"
|
||||
|
||||
ordByEnumTy :: EnumTyInfo
|
||||
ordByEnumTy =
|
||||
mkHsraEnumTyInfo (Just desc) ordByTy $ mapFromL _eviVal $
|
||||
map mkEnumVal enumVals
|
||||
where
|
||||
desc = G.Description "column ordering options"
|
||||
mkEnumVal (n, d) =
|
||||
EnumValInfo (Just d) (G.EnumValue n) False
|
||||
enumVals =
|
||||
[ ( "asc"
|
||||
, "in the ascending order, nulls last"
|
||||
),
|
||||
( "asc_nulls_last"
|
||||
, "in the ascending order, nulls last"
|
||||
),
|
||||
( "asc_nulls_first"
|
||||
, "in the ascending order, nulls first"
|
||||
),
|
||||
( "desc"
|
||||
, "in the descending order, nulls first"
|
||||
),
|
||||
( "desc_nulls_first"
|
||||
, "in the descending order, nulls first"
|
||||
),
|
||||
( "desc_nulls_last"
|
||||
, "in the descending order, nulls last"
|
||||
)
|
||||
]
|
||||
|
||||
defaultTypes :: [TypeInfo]
|
||||
defaultTypes = $(fromSchemaDocQ defaultSchema HasuraType)
|
||||
|
||||
|
||||
mkGCtx :: TyAgg -> RootFlds -> InsCtxMap -> GCtx
|
||||
mkGCtx (TyAgg tyInfos fldInfos ordByEnums) (RootFlds flds) insCtxMap =
|
||||
let queryRoot = mkHsraObjTyInfo (Just "query root")
|
||||
(G.NamedType "query_root") $
|
||||
mapFromL _fiName (schemaFld:typeFld:qFlds)
|
||||
colTys = Set.toList $ Set.fromList $ map pgiType $
|
||||
lefts $ Map.elems fldInfos
|
||||
scalarTys = map (TIScalar . mkHsraScalarTyInfo) colTys
|
||||
compTys = map (TIInpObj . mkCompExpInp) colTys
|
||||
ordByEnumTyM = bool (Just ordByEnumTy) Nothing $ null qFlds
|
||||
allTys = Map.union tyInfos $ mkTyInfoMap $
|
||||
catMaybes [ Just $ TIObj queryRoot
|
||||
, TIObj <$> mutRootM
|
||||
, TIObj <$> subRootM
|
||||
, TIEnum <$> ordByEnumTyM
|
||||
] <>
|
||||
scalarTys <> compTys <> defaultTypes
|
||||
-- for now subscription root is query root
|
||||
in GCtx allTys fldInfos ordByEnums queryRoot mutRootM (Just queryRoot)
|
||||
(Map.map fst flds) insCtxMap
|
||||
where
|
||||
mkMutRoot =
|
||||
mkHsraObjTyInfo (Just "mutation root") (G.NamedType "mutation_root") .
|
||||
mapFromL _fiName
|
||||
mutRootM = bool (Just $ mkMutRoot mFlds) Nothing $ null mFlds
|
||||
mkSubRoot =
|
||||
mkHsraObjTyInfo (Just "subscription root")
|
||||
(G.NamedType "subscription_root") . mapFromL _fiName
|
||||
subRootM = bool (Just $ mkSubRoot qFlds) Nothing $ null qFlds
|
||||
(qFlds, mFlds) = partitionEithers $ map snd $ Map.elems flds
|
||||
schemaFld = mkHsraObjFldInfo Nothing "__schema" Map.empty $
|
||||
G.toGT $ G.toNT $ G.NamedType "__Schema"
|
||||
typeFld = mkHsraObjFldInfo Nothing "__type" typeFldArgs $
|
||||
G.toGT $ G.NamedType "__Type"
|
||||
where
|
||||
typeFldArgs = mapFromL _iviName [
|
||||
InpValInfo (Just "name of the type") "name"
|
||||
$ G.toGT $ G.toNT $ G.NamedType "String"
|
||||
]
|
||||
|
||||
emptyGCtx :: GCtx
|
||||
emptyGCtx = mkGCtx mempty mempty mempty
|
@ -11,11 +11,11 @@ module Hasura.GraphQL.Explain
|
||||
import qualified Data.Aeson as J
|
||||
import qualified Data.Aeson.Casing as J
|
||||
import qualified Data.Aeson.TH as J
|
||||
import qualified Data.ByteString.Lazy as BL
|
||||
import qualified Data.HashMap.Strict as Map
|
||||
import qualified Database.PG.Query as Q
|
||||
import qualified Language.GraphQL.Draft.Syntax as G
|
||||
import qualified Text.Builder as TB
|
||||
import qualified Data.ByteString.Lazy as BL
|
||||
|
||||
import Hasura.GraphQL.Resolve.Context
|
||||
import Hasura.GraphQL.Schema
|
||||
@ -27,8 +27,10 @@ import Hasura.SQL.Types
|
||||
import Hasura.SQL.Value
|
||||
|
||||
import qualified Hasura.GraphQL.Resolve.Select as RS
|
||||
import qualified Hasura.GraphQL.Transport.HTTP as TH
|
||||
import qualified Hasura.GraphQL.Transport.HTTP.Protocol as GH
|
||||
import qualified Hasura.GraphQL.Validate as GV
|
||||
import qualified Hasura.GraphQL.Validate.Types as VT
|
||||
import qualified Hasura.RQL.DML.Select as RS
|
||||
import qualified Hasura.Server.Query as RQ
|
||||
import qualified Hasura.SQL.DML as S
|
||||
@ -112,20 +114,34 @@ explainGQLQuery
|
||||
:: (MonadError QErr m, MonadIO m)
|
||||
=> Q.PGPool
|
||||
-> Q.TxIsolation
|
||||
-> GCtxMap
|
||||
-> SchemaCache
|
||||
-> GQLExplain
|
||||
-> m BL.ByteString
|
||||
explainGQLQuery pool iso gCtxMap (GQLExplain query userVarsRaw)= do
|
||||
(opTy, selSet) <- runReaderT (GV.validateGQ query) gCtx
|
||||
explainGQLQuery pool iso sc (GQLExplain query userVarsRaw)= do
|
||||
(gCtx, _) <- flip runStateT sc $ getGCtx (userRole userInfo) gCtxMap
|
||||
queryParts <- runReaderT (GV.getQueryParts query) gCtx
|
||||
let topLevelNodes = TH.getTopLevelNodes (GV.qpOpDef queryParts)
|
||||
|
||||
unless (allHasuraNodes gCtx topLevelNodes) $
|
||||
throw400 InvalidParams "only hasura queries can be explained"
|
||||
|
||||
(opTy, selSet) <- runReaderT (GV.validateGQ queryParts) gCtx
|
||||
unless (opTy == G.OperationTypeQuery) $
|
||||
throw400 InvalidParams "only queries can be explained"
|
||||
let tx = mapM (explainField userInfo gCtx) (toList selSet)
|
||||
plans <- liftIO (runExceptT $ runTx tx) >>= liftEither
|
||||
return $ J.encode plans
|
||||
where
|
||||
gCtxMap = scGCtxMap sc
|
||||
usrVars = mkUserVars $ maybe [] Map.toList userVarsRaw
|
||||
userInfo = mkUserInfo (fromMaybe adminRole $ roleFromVars usrVars) usrVars
|
||||
gCtx = getGCtx (userRole userInfo) gCtxMap
|
||||
runTx tx =
|
||||
Q.runTx pool (iso, Nothing) $
|
||||
RQ.setHeadersTx (userVars userInfo) >> tx
|
||||
|
||||
allHasuraNodes gCtx nodes =
|
||||
let typeLocs = TH.gatherTypeLocs gCtx nodes
|
||||
isHasuraNode = \case
|
||||
VT.HasuraType -> True
|
||||
VT.RemoteType _ _ -> False
|
||||
in all isHasuraNode typeLocs
|
||||
|
404
server/src-lib/Hasura/GraphQL/RemoteServer.hs
Normal file
@ -0,0 +1,404 @@
|
||||
{-# LANGUAGE DeriveGeneric #-}
|
||||
{-# LANGUAGE FlexibleContexts #-}
|
||||
{-# LANGUAGE FlexibleInstances #-}
|
||||
{-# LANGUAGE OverloadedStrings #-}
|
||||
{-# LANGUAGE QuasiQuotes #-}
|
||||
{-# LANGUAGE ScopedTypeVariables #-}
|
||||
{-# LANGUAGE TemplateHaskell #-}
|
||||
|
||||
module Hasura.GraphQL.RemoteServer where
|
||||
|
||||
import Control.Exception (try)
|
||||
import Control.Lens ((^.))
|
||||
import Data.Aeson ((.:), (.:?))
|
||||
import Data.FileEmbed (embedStringFile)
|
||||
import Data.Foldable (foldlM)
|
||||
import Hasura.Prelude
|
||||
|
||||
import qualified Data.Aeson as J
|
||||
import qualified Data.ByteString.Lazy as BL
|
||||
import qualified Data.CaseInsensitive as CI
|
||||
import qualified Data.HashMap.Strict as Map
|
||||
import qualified Data.Text as T
|
||||
import qualified Data.Text.Encoding as T
|
||||
import qualified Language.GraphQL.Draft.Syntax as G
|
||||
import qualified Network.HTTP.Client as HTTP
|
||||
import qualified Network.Wreq as Wreq
|
||||
|
||||
import Hasura.HTTP.Utils (wreqOptions)
|
||||
import Hasura.RQL.DDL.Headers (getHeadersFromConf)
|
||||
import Hasura.RQL.Types
|
||||
|
||||
import qualified Hasura.GraphQL.Schema as GS
|
||||
import qualified Hasura.GraphQL.Validate.Types as VT
|
||||
|
||||
|
||||
introspectionQuery :: BL.ByteString
|
||||
introspectionQuery = $(embedStringFile "src-rsr/introspection.json")
|
||||
|
||||
fetchRemoteSchema
|
||||
:: (MonadIO m, MonadError QErr m)
|
||||
=> HTTP.Manager
|
||||
-> RemoteSchemaName
|
||||
-> RemoteSchemaInfo
|
||||
-> m GS.RemoteGCtx
|
||||
fetchRemoteSchema manager name def@(RemoteSchemaInfo url headerConf _) = do
|
||||
headers <- getHeadersFromConf headerConf
|
||||
let hdrs = map (\(hn, hv) -> (CI.mk . T.encodeUtf8 $ hn, T.encodeUtf8 hv)) headers
|
||||
options = wreqOptions manager hdrs
|
||||
res <- liftIO $ try $ Wreq.postWith options (show url) introspectionQuery
|
||||
resp <- either throwHttpErr return res
|
||||
|
||||
let respData = resp ^. Wreq.responseBody
|
||||
statusCode = resp ^. Wreq.responseStatus . Wreq.statusCode
|
||||
when (statusCode /= 200) $ schemaErr respData
|
||||
|
||||
introspectRes :: (FromIntrospection IntrospectionResult) <-
|
||||
either schemaErr return $ J.eitherDecode respData
|
||||
let (G.SchemaDocument tyDefs, qRootN, mRootN, _) =
|
||||
fromIntrospection introspectRes
|
||||
let etTypeInfos = mapM fromRemoteTyDef tyDefs
|
||||
typeInfos <- either schemaErr return etTypeInfos
|
||||
let typMap = VT.mkTyInfoMap typeInfos
|
||||
mQrTyp = Map.lookup qRootN typMap
|
||||
mMrTyp = maybe Nothing (\mr -> Map.lookup mr typMap) mRootN
|
||||
qrTyp <- liftMaybe noQueryRoot mQrTyp
|
||||
let mRmQR = VT.getObjTyM qrTyp
|
||||
mRmMR = join $ VT.getObjTyM <$> mMrTyp
|
||||
rmQR <- liftMaybe (err400 Unexpected "query root has to be an object type") mRmQR
|
||||
return $ GS.RemoteGCtx typMap rmQR mRmMR Nothing
|
||||
|
||||
where
|
||||
noQueryRoot = err400 Unexpected "query root not found in remote schema"
|
||||
fromRemoteTyDef ty = VT.fromTyDef ty $ VT.RemoteType name def
|
||||
schemaErr err = throw400 RemoteSchemaError (T.pack $ show err)
|
||||
|
||||
throwHttpErr :: (MonadError QErr m) => HTTP.HttpException -> m a
|
||||
throwHttpErr = schemaErr
|
||||
|
||||
mergeSchemas
|
||||
:: (MonadIO m, MonadError QErr m)
|
||||
=> RemoteSchemaMap
|
||||
-> GS.GCtxMap
|
||||
-> HTTP.Manager
|
||||
-> m (GS.GCtxMap, GS.GCtx) -- the merged GCtxMap and the default GCtx without roles
|
||||
mergeSchemas rmSchemaMap gCtxMap httpManager = do
|
||||
remoteSchemas <- forM (Map.toList rmSchemaMap) $ \(name, def) ->
|
||||
fetchRemoteSchema httpManager name def
|
||||
def <- mkDefaultRemoteGCtx remoteSchemas
|
||||
merged <- mergeRemoteSchema gCtxMap def
|
||||
return (merged, def)
|
||||
|
||||
mkDefaultRemoteGCtx
|
||||
:: (MonadError QErr m)
|
||||
=> [GS.RemoteGCtx] -> m GS.GCtx
|
||||
mkDefaultRemoteGCtx =
|
||||
foldlM (\combG -> mergeGCtx combG . convRemoteGCtx) GS.emptyGCtx
|
||||
|
||||
mergeRemoteSchema
|
||||
:: (MonadError QErr m)
|
||||
=> GS.GCtxMap
|
||||
-> GS.GCtx
|
||||
-> m GS.GCtxMap
|
||||
mergeRemoteSchema ctxMap mergedRemoteGCtx = do
|
||||
res <- forM (Map.toList ctxMap) $ \(role, gCtx) -> do
|
||||
updatedGCtx <- mergeGCtx gCtx mergedRemoteGCtx
|
||||
return (role, updatedGCtx)
|
||||
return $ Map.fromList res
|
||||
|
||||
mergeGCtx
|
||||
:: (MonadError QErr m)
|
||||
=> GS.GCtx
|
||||
-> GS.GCtx
|
||||
-> m GS.GCtx
|
||||
mergeGCtx gCtx rmMergedGCtx = do
|
||||
let rmTypes = GS._gTypes rmMergedGCtx
|
||||
hsraTyMap = GS._gTypes gCtx
|
||||
GS.checkSchemaConflicts gCtx rmMergedGCtx
|
||||
let newQR = mergeQueryRoot gCtx rmMergedGCtx
|
||||
newMR = mergeMutRoot gCtx rmMergedGCtx
|
||||
newTyMap = mergeTyMaps hsraTyMap rmTypes newQR newMR
|
||||
updatedGCtx = gCtx { GS._gTypes = newTyMap
|
||||
, GS._gQueryRoot = newQR
|
||||
, GS._gMutRoot = newMR
|
||||
}
|
||||
return updatedGCtx
|
||||
|
||||
convRemoteGCtx :: GS.RemoteGCtx -> GS.GCtx
|
||||
convRemoteGCtx rmGCtx =
|
||||
GS.emptyGCtx { GS._gTypes = GS._rgTypes rmGCtx
|
||||
, GS._gQueryRoot = GS._rgQueryRoot rmGCtx
|
||||
, GS._gMutRoot = GS._rgMutationRoot rmGCtx
|
||||
}
|
||||
|
||||
|
||||
mergeQueryRoot :: GS.GCtx -> GS.GCtx -> VT.ObjTyInfo
|
||||
mergeQueryRoot a b = GS._gQueryRoot a <> GS._gQueryRoot b
|
||||
|
||||
mergeMutRoot :: GS.GCtx -> GS.GCtx -> Maybe VT.ObjTyInfo
|
||||
mergeMutRoot a b =
|
||||
let objA' = fromMaybe mempty $ GS._gMutRoot a
|
||||
objB = fromMaybe mempty $ GS._gMutRoot b
|
||||
objA = newRootOrEmpty objA' objB
|
||||
merged = objA <> objB
|
||||
in bool (Just merged) Nothing $ merged == mempty
|
||||
where
|
||||
newRootOrEmpty x y =
|
||||
if x == mempty && y /= mempty
|
||||
then mkNewEmptyMutRoot
|
||||
else x
|
||||
|
||||
mkNewEmptyMutRoot :: VT.ObjTyInfo
|
||||
mkNewEmptyMutRoot = VT.ObjTyInfo (Just "mutation root")
|
||||
(G.NamedType "mutation_root") Map.empty
|
||||
|
||||
mkNewMutRoot :: VT.ObjFieldMap -> VT.ObjTyInfo
|
||||
mkNewMutRoot flds = VT.ObjTyInfo (Just "mutation root")
|
||||
(G.NamedType "mutation_root") flds
|
||||
|
||||
mergeTyMaps
|
||||
:: VT.TypeMap
|
||||
-> VT.TypeMap
|
||||
-> VT.ObjTyInfo
|
||||
-> Maybe VT.ObjTyInfo
|
||||
-> VT.TypeMap
|
||||
mergeTyMaps hTyMap rmTyMap newQR newMR =
|
||||
let newTyMap = hTyMap <> rmTyMap
|
||||
newTyMap' = Map.insert (G.NamedType "query_root") (VT.TIObj newQR) $
|
||||
newTyMap
|
||||
in maybe newTyMap' (\mr -> Map.insert
|
||||
(G.NamedType "mutation_root")
|
||||
(VT.TIObj mr) newTyMap') newMR
|
||||
|
||||
|
||||
-- parsing the introspection query result
|
||||
|
||||
newtype FromIntrospection a
|
||||
= FromIntrospection { fromIntrospection :: a }
|
||||
deriving (Show, Eq, Generic)
|
||||
|
||||
pErr :: (Monad m) => Text -> m a
|
||||
pErr = fail . T.unpack
|
||||
|
||||
kindErr :: (Monad m) => Text -> Text -> m a
|
||||
kindErr gKind eKind = pErr $ "Invalid `kind: " <> gKind <> "` in " <> eKind
|
||||
|
||||
instance J.FromJSON (FromIntrospection G.Description) where
|
||||
parseJSON = fmap (FromIntrospection . G.Description) . J.parseJSON
|
||||
|
||||
instance J.FromJSON (FromIntrospection G.ScalarTypeDefinition) where
|
||||
parseJSON = J.withObject "ScalarTypeDefinition" $ \o -> do
|
||||
kind <- o .: "kind"
|
||||
name <- o .: "name"
|
||||
desc <- o .:? "description"
|
||||
when (kind /= "SCALAR") $ kindErr kind "scalar"
|
||||
let desc' = fmap fromIntrospection desc
|
||||
r = G.ScalarTypeDefinition desc' name []
|
||||
return $ FromIntrospection r
|
||||
|
||||
instance J.FromJSON (FromIntrospection G.ObjectTypeDefinition) where
|
||||
parseJSON = J.withObject "ObjectTypeDefinition" $ \o -> do
|
||||
kind <- o .: "kind"
|
||||
name <- o .: "name"
|
||||
desc <- o .:? "description"
|
||||
fields <- o .:? "fields"
|
||||
interfaces <- o .:? "interfaces"
|
||||
when (kind /= "OBJECT") $ kindErr kind "object"
|
||||
let implIfaces = map (G.NamedType . G._itdName) $
|
||||
maybe [] (fmap fromIntrospection) interfaces
|
||||
flds = maybe [] (fmap fromIntrospection) fields
|
||||
desc' = fmap fromIntrospection desc
|
||||
r = G.ObjectTypeDefinition desc' name implIfaces [] flds
|
||||
return $ FromIntrospection r
|
||||
|
||||
instance J.FromJSON (FromIntrospection G.FieldDefinition) where
|
||||
parseJSON = J.withObject "FieldDefinition" $ \o -> do
|
||||
name <- o .: "name"
|
||||
desc <- o .:? "description"
|
||||
args <- o .: "args"
|
||||
_type <- o .: "type"
|
||||
let desc' = fmap fromIntrospection desc
|
||||
r = G.FieldDefinition desc' name (fmap fromIntrospection args)
|
||||
(fromIntrospection _type) []
|
||||
return $ FromIntrospection r
|
||||
|
||||
instance J.FromJSON (FromIntrospection G.GType) where
|
||||
parseJSON = J.withObject "GType" $ \o -> do
|
||||
kind <- o .: "kind"
|
||||
mName <- o .:? "name"
|
||||
mType <- o .:? "ofType"
|
||||
r <- case (kind, mName, mType) of
|
||||
("NON_NULL", _, Just typ) -> return $ mkNotNull (fromIntrospection typ)
|
||||
("NON_NULL", _, Nothing) -> pErr "NON_NULL should have `ofType`"
|
||||
("LIST", _, Just typ) ->
|
||||
return $ G.TypeList (G.Nullability True)
|
||||
(G.ListType $ fromIntrospection typ)
|
||||
("LIST", _, Nothing) -> pErr "LIST should have `ofType`"
|
||||
(_, Just name, _) -> return $ G.TypeNamed (G.Nullability True) name
|
||||
_ -> pErr $ "kind: " <> kind <> " should have name"
|
||||
return $ FromIntrospection r
|
||||
|
||||
where
|
||||
mkNotNull typ = case typ of
|
||||
G.TypeList _ ty -> G.TypeList (G.Nullability False) ty
|
||||
G.TypeNamed _ n -> G.TypeNamed (G.Nullability False) n
|
||||
|
||||
|
||||
instance J.FromJSON (FromIntrospection G.InputValueDefinition) where
|
||||
parseJSON = J.withObject "InputValueDefinition" $ \o -> do
|
||||
name <- o .: "name"
|
||||
desc <- o .:? "description"
|
||||
_type <- o .: "type"
|
||||
--defValue <- o .: "defaultValue"
|
||||
let desc' = fmap fromIntrospection desc
|
||||
r = G.InputValueDefinition desc' name (fromIntrospection _type) Nothing
|
||||
return $ FromIntrospection r
|
||||
|
||||
|
||||
-- instance J.FromJSON (FromIntrospection G.ListType) where
|
||||
-- parseJSON = parseJSON
|
||||
|
||||
-- instance (J.FromJSON (G.ObjectFieldG a)) =>
|
||||
-- J.FromJSON (FromIntrospection (G.ObjectValueG a)) where
|
||||
-- parseJSON = fmap (FromIntrospection . G.ObjectValueG) . J.parseJSON
|
||||
|
||||
-- instance (J.FromJSON a) => J.FromJSON (FromIntrospection (G.ObjectFieldG a)) where
|
||||
-- parseJSON = J.withObject "ObjectValueG a" $ \o -> do
|
||||
-- name <- o .: "name"
|
||||
-- ofVal <- o .: "value"
|
||||
-- return $ FromIntrospection $ G.ObjectFieldG name ofVal
|
||||
|
||||
-- instance J.FromJSON (FromIntrospection G.ValueConst) where
|
||||
-- parseJSON =
|
||||
-- fmap FromIntrospection .
|
||||
-- $(J.mkParseJSON J.defaultOptions{J.sumEncoding=J.UntaggedValue} ''G.ValueConst)
|
||||
|
||||
-- instance J.FromJSON (FromIntrospection G.Value) where
|
||||
-- parseJSON =
|
||||
-- fmap FromIntrospection .
|
||||
-- $(J.mkParseJSON J.defaultOptions{J.sumEncoding=J.UntaggedValue} ''G.Value)
|
||||
|
||||
|
||||
-- $(J.deriveFromJSON J.defaultOptions{J.sumEncoding=J.UntaggedValue} ''G.Value)
|
||||
|
||||
|
||||
instance J.FromJSON (FromIntrospection G.InterfaceTypeDefinition) where
|
||||
parseJSON = J.withObject "InterfaceTypeDefinition" $ \o -> do
|
||||
kind <- o .: "kind"
|
||||
name <- o .: "name"
|
||||
desc <- o .:? "description"
|
||||
fields <- o .:? "fields"
|
||||
let flds = maybe [] (fmap fromIntrospection) fields
|
||||
desc' = fmap fromIntrospection desc
|
||||
when (kind /= "INTERFACE") $ kindErr kind "interface"
|
||||
let r = G.InterfaceTypeDefinition desc' name [] flds
|
||||
return $ FromIntrospection r
|
||||
|
||||
instance J.FromJSON (FromIntrospection G.UnionTypeDefinition) where
|
||||
parseJSON = J.withObject "UnionTypeDefinition" $ \o -> do
|
||||
kind <- o .: "kind"
|
||||
name <- o .: "name"
|
||||
desc <- o .:? "description"
|
||||
possibleTypes <- o .: "possibleTypes"
|
||||
let memberTys = map (G.NamedType . G._otdName) $
|
||||
fmap fromIntrospection possibleTypes
|
||||
desc' = fmap fromIntrospection desc
|
||||
when (kind /= "UNION") $ kindErr kind "union"
|
||||
let r = G.UnionTypeDefinition desc' name [] memberTys
|
||||
return $ FromIntrospection r
|
||||
|
||||
instance J.FromJSON (FromIntrospection G.EnumTypeDefinition) where
|
||||
parseJSON = J.withObject "EnumTypeDefinition" $ \o -> do
|
||||
kind <- o .: "kind"
|
||||
name <- o .: "name"
|
||||
desc <- o .:? "description"
|
||||
vals <- o .: "enumValues"
|
||||
when (kind /= "ENUM") $ kindErr kind "enum"
|
||||
let desc' = fmap fromIntrospection desc
|
||||
let r = G.EnumTypeDefinition desc' name [] (fmap fromIntrospection vals)
|
||||
return $ FromIntrospection r
|
||||
|
||||
instance J.FromJSON (FromIntrospection G.EnumValueDefinition) where
|
||||
parseJSON = J.withObject "EnumValueDefinition" $ \o -> do
|
||||
name <- o .: "name"
|
||||
desc <- o .:? "description"
|
||||
let desc' = fmap fromIntrospection desc
|
||||
let r = G.EnumValueDefinition desc' name []
|
||||
return $ FromIntrospection r
|
||||
|
||||
instance J.FromJSON (FromIntrospection G.InputObjectTypeDefinition) where
|
||||
parseJSON = J.withObject "InputObjectTypeDefinition" $ \o -> do
|
||||
kind <- o .: "kind"
|
||||
name <- o .: "name"
|
||||
desc <- o .:? "description"
|
||||
mInputFields <- o .:? "inputFields"
|
||||
let inputFields = maybe [] (fmap fromIntrospection) mInputFields
|
||||
let desc' = fmap fromIntrospection desc
|
||||
when (kind /= "INPUT_OBJECT") $ kindErr kind "input_object"
|
||||
let r = G.InputObjectTypeDefinition desc' name [] inputFields
|
||||
return $ FromIntrospection r
|
||||
|
||||
instance J.FromJSON (FromIntrospection G.TypeDefinition) where
  parseJSON = J.withObject "TypeDefinition" $ \o -> do
    kind :: Text <- o .: "kind"
    r <- case kind of
      "SCALAR" ->
        G.TypeDefinitionScalar . fromIntrospection <$> J.parseJSON (J.Object o)
      "OBJECT" ->
        G.TypeDefinitionObject . fromIntrospection <$> J.parseJSON (J.Object o)
      "INTERFACE" ->
        G.TypeDefinitionInterface . fromIntrospection <$> J.parseJSON (J.Object o)
      "UNION" ->
        G.TypeDefinitionUnion . fromIntrospection <$> J.parseJSON (J.Object o)
      "ENUM" ->
        G.TypeDefinitionEnum . fromIntrospection <$> J.parseJSON (J.Object o)
      "INPUT_OBJECT" ->
        G.TypeDefinitionInputObject . fromIntrospection <$> J.parseJSON (J.Object o)
      _ -> pErr $ "unknown kind: " <> kind
    return $ FromIntrospection r

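The kind-based dispatch in the `TypeDefinition` instance can be illustrated with a base-only sketch. `TypeDef`, `Obj`, and `parseTypeDef` below are simplified stand-ins (an association list plays the JSON object), not the real aeson or graphql-parser API:

```haskell
-- Toy AST: one constructor per introspection "kind" (subset only).
data TypeDef
  = Scalar String
  | Object String
  | Enum String
  deriving (Show, Eq)

-- An association list stands in for a decoded JSON object.
type Obj = [(String, String)]

-- Read the "kind" discriminator first, then pick the matching
-- constructor; unknown kinds fail the whole parse, as pErr does above.
parseTypeDef :: Obj -> Either String TypeDef
parseTypeDef o = do
  kind <- field "kind"
  name <- field "name"
  case kind of
    "SCALAR" -> Right (Scalar name)
    "OBJECT" -> Right (Object name)
    "ENUM"   -> Right (Enum name)
    _        -> Left ("unknown kind: " ++ kind)
  where
    field k = maybe (Left ("missing " ++ k)) Right (lookup k o)
```

The real instance reuses the same object for the nested parse (`J.parseJSON (J.Object o)`), so each per-kind instance re-checks the `kind` field it expects.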
type IntrospectionResult = ( G.SchemaDocument
                           , G.NamedType
                           , Maybe G.NamedType
                           , Maybe G.NamedType
                           )

instance J.FromJSON (FromIntrospection IntrospectionResult) where
  parseJSON = J.withObject "SchemaDocument" $ \o -> do
    _data <- o .: "data"
    schema <- _data .: "__schema"
    -- the list of types
    types <- schema .: "types"
    -- query root
    queryType <- schema .: "queryType"
    queryRoot <- queryType .: "name"
    -- mutation root
    mMutationType <- schema .:? "mutationType"
    mutationRoot <- case mMutationType of
      Nothing      -> return Nothing
      Just mutType -> do
        mutRoot <- mutType .: "name"
        return $ Just mutRoot
    -- subscription root
    mSubsType <- schema .:? "subscriptionType"
    subsRoot <- case mSubsType of
      Nothing       -> return Nothing
      Just subsType -> do
        subRoot <- subsType .: "name"
        return $ Just subRoot
    let r = ( G.SchemaDocument (fmap fromIntrospection types)
            , queryRoot
            , mutationRoot
            , subsRoot
            )
    return $ FromIntrospection r

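The mutation/subscription root handling above (case on the optional object, extract its `"name"`) is the `traverse` pattern over `Maybe`. A base-only sketch, where `lookupName` and the association list stand in for the aeson `(.:)` accessor:

```haskell
-- Extract the "name" field of a toy object; may fail.
lookupName :: [(String, String)] -> Either String String
lookupName o = maybe (Left "missing name") Right (lookup "name" o)

-- Nothing stays Nothing; a present object must yield a name,
-- otherwise the whole parse fails.
rootName :: Maybe [(String, String)] -> Either String (Maybe String)
rootName = traverse lookupName
```

This is behaviorally the same as the explicit `case … of Nothing -> return Nothing; Just t -> Just <$> t .: "name"` in the instance.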
getNamedTyp :: G.TypeDefinition -> G.Name
getNamedTyp ty = case ty of
  G.TypeDefinitionScalar t      -> G._stdName t
  G.TypeDefinitionObject t      -> G._otdName t
  G.TypeDefinitionInterface t   -> G._itdName t
  G.TypeDefinitionUnion t       -> G._utdName t
  G.TypeDefinitionEnum t        -> G._etdName t
  G.TypeDefinitionInputObject t -> G._iotdName t
@@ -15,6 +15,7 @@ import qualified Data.HashMap.Strict as Map
import qualified Database.PG.Query as Q
import qualified Language.GraphQL.Draft.Syntax as G


import Hasura.GraphQL.Resolve.Context
import Hasura.GraphQL.Resolve.Introspect
import Hasura.GraphQL.Schema
@@ -3,6 +3,7 @@
{-# LANGUAGE MultiWayIf #-}
{-# LANGUAGE NoImplicitPrelude #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE TemplateHaskell #-}

module Hasura.GraphQL.Resolve.Context
  ( FieldMap
@@ -30,11 +31,16 @@ module Hasura.GraphQL.Resolve.Context
import Data.Has
import Hasura.Prelude

import qualified Data.ByteString.Lazy as BL
import qualified Data.HashMap.Strict as Map
import qualified Data.Sequence as Seq
import qualified Database.PG.Query as Q
import qualified Language.GraphQL.Draft.Syntax as G
import qualified Data.Aeson as J
import qualified Data.Aeson.Casing as J
import qualified Data.Aeson.TH as J
import qualified Data.ByteString.Lazy as BL
import qualified Data.HashMap.Strict as Map
import qualified Data.Sequence as Seq
import qualified Database.PG.Query as Q
import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.GraphQL.Resolve.ContextTypes

import Hasura.GraphQL.Utils
import Hasura.GraphQL.Validate.Field
@@ -43,11 +49,18 @@ import Hasura.RQL.Types
import Hasura.SQL.Types
import Hasura.SQL.Value

import qualified Hasura.SQL.DML as S
import qualified Hasura.SQL.DML as S

type FieldMap
  = Map.HashMap (G.NamedType, G.Name)
    (Either PGColInfo (RelInfo, Bool, AnnBoolExpSQL, Maybe Int))
data InsResp
  = InsResp
  { _irAffectedRows :: !Int
  , _irResponse :: !(Maybe J.Object)
  } deriving (Show, Eq)
$(J.deriveJSON (J.aesonDrop 3 J.snakeCase) ''InsResp)

-- type FieldMap
--   = Map.HashMap (G.NamedType, G.Name)
--     (Either PGColInfo (RelInfo, Bool, AnnBoolExpSQL, Maybe Int))

-- data OrdTy
--   = OAsc
@@ -61,28 +74,28 @@ type FieldMap

type RespTx = Q.TxE QErr BL.ByteString

-- order by context
data OrdByItem
  = OBIPGCol !PGColInfo
  | OBIRel !RelInfo !AnnBoolExpSQL
  deriving (Show, Eq)
-- -- order by context
-- data OrdByItem
--   = OBIPGCol !PGColInfo
--   | OBIRel !RelInfo !AnnBoolExpSQL
--   deriving (Show, Eq)

type OrdByItemMap = Map.HashMap G.Name OrdByItem
-- type OrdByItemMap = Map.HashMap G.Name OrdByItem

type OrdByCtx = Map.HashMap G.NamedType OrdByItemMap
-- type OrdByCtx = Map.HashMap G.NamedType OrdByItemMap

-- insert context
type RelationInfoMap = Map.HashMap RelName RelInfo
-- -- insert context
-- type RelationInfoMap = Map.HashMap RelName RelInfo

data InsCtx
  = InsCtx
  { icView :: !QualifiedTable
  , icColumns :: ![PGColInfo]
  , icSet :: !InsSetCols
  , icRelations :: !RelationInfoMap
  } deriving (Show, Eq)
-- data InsCtx
--   = InsCtx
--   { icView :: !QualifiedTable
--   , icColumns :: ![PGColInfo]
--   , icSet :: !InsSetCols
--   , icRelations :: !RelationInfoMap
--   } deriving (Show, Eq)

type InsCtxMap = Map.HashMap QualifiedTable InsCtx
-- type InsCtxMap = Map.HashMap QualifiedTable InsCtx

getFldInfo
  :: (MonadError QErr m, MonadReader r m, Has FieldMap r)
server/src-lib/Hasura/GraphQL/Resolve/ContextTypes.hs (new file, 39 lines)
@@ -0,0 +1,39 @@
module Hasura.GraphQL.Resolve.ContextTypes where

import Hasura.Prelude

import qualified Data.HashMap.Strict as Map
import qualified Language.GraphQL.Draft.Syntax as G

import Hasura.RQL.Types.BoolExp
import Hasura.RQL.Types.Common
import Hasura.RQL.Types.SchemaCacheTypes
import Hasura.SQL.Types


type FieldMap
  = Map.HashMap (G.NamedType, G.Name)
    (Either PGColInfo (RelInfo, Bool, AnnBoolExpSQL, Maybe Int))

-- order by context
data OrdByItem
  = OBIPGCol !PGColInfo
  | OBIRel !RelInfo !AnnBoolExpSQL
  deriving (Show, Eq)

type OrdByItemMap = Map.HashMap G.Name OrdByItem

type OrdByCtx = Map.HashMap G.NamedType OrdByItemMap

-- insert context
type RelationInfoMap = Map.HashMap RelName RelInfo

data InsCtx
  = InsCtx
  { icView :: !QualifiedTable
  , icColumns :: ![PGColInfo]
  , icSet :: !InsSetCols
  , icRelations :: !RelationInfoMap
  } deriving (Show, Eq)

type InsCtxMap = Map.HashMap QualifiedTable InsCtx
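The `FieldMap` above keys resolver info by a (type, field) pair. A self-contained sketch of that lookup shape, using `Data.Map` from containers instead of `Data.HashMap.Strict`, with simplified stand-ins for `(G.NamedType, G.Name)` and the `Either PGColInfo (…)` payload:

```haskell
import qualified Data.Map as Map

type TypeName  = String
type FieldName = String

-- Stand-in for Either PGColInfo (RelInfo, ...): a field resolves to
-- either a plain column or a relationship to another table.
data FieldInfo = Column String | Relationship String
  deriving (Show, Eq)

type FieldMap = Map.Map (TypeName, FieldName) FieldInfo

-- Illustrative data, not from the PR.
demoFieldMap :: FieldMap
demoFieldMap = Map.fromList
  [ (("author", "name"),     Column "text")
  , (("author", "articles"), Relationship "article")
  ]

-- Mirrors the shape of getFldInfo: a validated field with no context
-- entry indicates an internal inconsistency, so fail loudly.
getFldInfo :: TypeName -> FieldName -> Either String FieldInfo
getFldInfo ty fld =
  maybe (Left ("field not found in context: " ++ fld)) Right
        (Map.lookup (ty, fld) demoFieldMap)
```

Keying on the pair rather than nesting maps keeps the common lookup a single probe.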
@@ -79,7 +79,7 @@ withObject fn v = case v of
  AGObject nt (Just obj) -> fn nt obj
  AGObject nt Nothing ->
    throw500 $ "unexpected null for ty"
    <> G.showGT (G.TypeNamed nt)
    <> G.showGT (G.TypeNamed (G.Nullability True) nt)
  _ -> tyMismatch "object" v

asObject
@@ -107,7 +107,7 @@ withArray
withArray fn v = case v of
  AGArray lt (Just l) -> fn lt l
  AGArray lt Nothing -> throw500 $ "unexpected null for ty"
                        <> G.showGT (G.TypeList lt)
                        <> G.showGT (G.TypeList (G.Nullability True) lt)
  _ -> tyMismatch "array" v

asArray
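Both hunks follow the same guard pattern: an annotated value carries its GraphQL type and a `Maybe` payload, and a null where the schema promises a value is an internal (500-class) error. A base-only sketch; `AnnValue` and `withValue` are illustrative names, not Hasura's actual definitions:

```haskell
-- A resolved value: its rendered GraphQL type plus an optional payload,
-- where Nothing models a JSON null.
data AnnValue a = AnnValue
  { avType  :: String
  , avValue :: Maybe a
  }

-- Run the continuation on a present payload; treat an unexpected null
-- as an internal error, like throw500 in the diff.
withValue :: (a -> Either String b) -> AnnValue a -> Either String b
withValue fn (AnnValue ty mVal) = case mVal of
  Just v  -> fn v
  Nothing -> Left ("unexpected null for ty " ++ ty)
```

The only change the hunks make is threading the new `G.Nullability` argument that `G.TypeNamed`/`G.TypeList` now require when rendering the type in the error message.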