Mirror of https://github.com/hasura/graphql-engine.git (synced 2024-10-05 14:28:08 +03:00)

Commit: add community boilerplates and examples (#430)
Parent: be20a11d37 · Commit: acd62c3bf8
@ -3,6 +3,7 @@ LICENSE
scripts/*
assets/*
docs/*
community/*
.circleci/*
.ciignore
.gitignore

.gitignore (vendored, 1 line changed)
@ -1 +1,2 @@
npm-debug.log
*.temp
@ -24,6 +24,19 @@ consisting of 3 components. Each has its own contributing guide:
All three components have a single version, denoted by either the git
tag, or a combination of branch name and git commit SHA.

### Docs

The contributing guide for docs can be found at [docs/CONTRIBUTING.md](docs/CONTRIBUTING.md).

### Community content

There is no specific contributing guide for community content. Anything that
helps GraphQL Engine community/users can go into this section. We have identified
[boilerplates](community/boilerplates), [examples](community/examples) and
[tools](community/tools) as primary candidates. Feel free to submit a pull
request if you have something to add (not necessarily belonging to the
aforementioned sections).

## Issues

### Reporting an Issue
LICENSE-community (new file, 24 lines)
@ -0,0 +1,24 @@
MIT Licence

Copyright (c) 2018-present, Hasura Technologies Private Limited

Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without
restriction, including without limitation the rights to use,
copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following
conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.
README.md (90 lines changed)
@ -17,11 +17,15 @@ Read more at [hasura.io](https://hasura.io) and the [docs](https://docs.hasura.i
------------------

![Hasura GraphQL Engine Demo](assets/demo.gif)

------------------

![Hasura GraphQL Engine Realtime Demo](assets/realtime.gif)

-------------------

## Features

* **Make powerful queries**: Built-in filtering, pagination, pattern search, bulk insert, update, delete mutations
* **Realtime**: Convert any GraphQL query to a live query by using subscriptions
* **Trigger webhooks or serverless functions**: On Postgres insert/update/delete events ([read more](event-triggers.md))
@ -33,12 +37,26 @@ Read more at [hasura.io](https://hasura.io) and the [docs](https://docs.hasura.i
Read more at: [https://hasura.io](https://hasura.io) and the [docs](https://docs.hasura.io).

## Table of contents
<!-- markdown-toc start - Don't edit this section. Run M-x markdown-toc-refresh-toc -->
**Table of Contents**

- [Quickstart:](#quickstart)
  - [One-click deployment on Heroku](#one-click-deployment-on-heroku)
  - [Other deployment methods](#other-deployment-methods)
- [Architecture](#architecture)
- [Client-side tooling](#client-side-tooling)
- [Add business logic](#add-business-logic)
  - [Custom resolvers](#custom-resolvers)
  - [Trigger webhooks on database events](#trigger-webhooks-on-database-events)
- [Demos](#demos)
  - [Realtime applications](#realtime-applications)
  - [Videos](#videos)
- [Support & Troubleshooting](#support--troubleshooting)
- [Contributing](#contributing)
- [License](#license)

<!-- markdown-toc end -->

## Quickstart:
@ -74,6 +92,51 @@ You can also place the engine behind a central GraphQL proxy that fronts multipl
Hasura works with any GraphQL client. We recommend using [Apollo Client](https://github.com/apollographql/apollo-client). See [awesome-graphql](https://github.com/chentsulin/awesome-graphql) for a list of clients.

## Add business logic

### Custom resolvers

Add custom resolvers in addition to the Hasura GraphQL engine. Ideal for delegating
to HTTP APIs, making direct calls to another data source or writing business
logic in code - [read more](community/boilerplates/custom-resolvers).

### Trigger webhooks on database events

Add asynchronous business logic that is triggered by database events.
Ideal for notifications, data pipelines from Postgres or asynchronous
processing - [read more](event-triggers.md).

## Demos

Check out all the example applications in the
[community/examples](community/examples) directory.

### Realtime applications

- Group chat application built with React; includes a typing indicator, online users and new
  message notifications.
  - [Try it out](https://chat-example-trial-roar.herokuapp.com/)
  - [Tutorial](community/examples/realtime-chat)
  - [Browse APIs](https://hasura-realtime-group-chat.herokuapp.com/)

- Live location tracking app that shows a running vehicle changing current GPS
  coordinates moving on a map.
  - [Try it out](https://hasura.github.io/realtime-location-app/)
  - [Tutorial](community/examples/realtime-location-tracking)
  - [Browse APIs](https://realtime-backend.herokuapp.com/)

- A realtime dashboard for data aggregations on continuously changing data.
  - [Try it out](https://shahidh.in/hasura-realtime-poll/)
  - [Tutorial](community/examples/realtime-poll)
  - [Browse APIs](https://hasura-realtime-poll.herokuapp.com/)

### Videos

* [Add GraphQL to a self-hosted GitLab instance](https://www.youtube.com/watch?v=a2AhxKqd82Q) (*3:44 mins*)
* [Todo app with Auth0 and GraphQL backend](https://www.youtube.com/watch?v=15ITBYnccgc) (*4:00 mins*)
* [GraphQL on GitLab integrated with GitLab auth](https://www.youtube.com/watch?v=m1ChRhRLq7o) (*4:05 mins*)
* [Dashboard for 10 million rides with geo-location (PostGIS, Timescale)](https://www.youtube.com/watch?v=tsY573yyGWA) (*3:06 mins*)

## Support & Troubleshooting
@ -92,6 +155,17 @@ Check out our [contributing guide](CONTRIBUTING.md) for more details.
## License

The core GraphQL Engine is available under the [GNU Affero General Public
License v3](https://www.gnu.org/licenses/agpl-3.0.en.html) (AGPL-3.0), the same
license as [MongoDB](https://www.mongodb.com/community/licensing). We have
written more about what you can and cannot do under AGPL
[here](https://github.com/hasura/graphql-engine/wiki/License-Explained).

**Commercial licenses** that bundle the Hasura GraphQL Engine with support and
SLAs are available on request. Please feel free to contact us at build@hasura.io
or on our [website chat](https://hasura.io).

All **other contents** (except those in [`server`](server), [`cli`](cli) and
[`console`](console) directories) are under the [MIT License](LICENSE-community).
This includes everything in the [`docs`](docs) and [`community`](community)
directories.
community/README.md (new file, 16 lines)
@ -0,0 +1,16 @@
# Community

This directory contains community-contributed code and content that supplements
the Hasura GraphQL Engine. It includes several example applications built using
GraphQL Engine to demonstrate its features, several boilerplates to help users get
started with auth webhooks, triggers, etc., and some community tooling around the Engine.

Feel free to open pull requests to add more content here.

- [Boilerplates](boilerplates)
- [Examples](examples)
- [Tools](tools)

## License

All contents of this directory are under the [MIT License](../LICENSE-community).
community/boilerplates/auth-webhooks/.gitignore (vendored, new file, 1 line)
@ -0,0 +1 @@
*.temp
community/boilerplates/auth-webhooks/README.md (new file, 12 lines)
@ -0,0 +1,12 @@
# Boilerplates for auth webhooks

This directory is a compilation of code samples (boilerplates) that can be used to implement custom auth webhooks for the Hasura GraphQL engine.

## Contributing

- Please fork this repo and submit a PR with your changes
- Please use a lowercase, hyphenated directory name
- Please add a helpful README that has the following sections:
  - Overview
  - Deployment instructions
  - Creation of any specific environment variables or secrets
@ -0,0 +1,96 @@
# Sample Firebase Cloud Function Auth Webhook for Hasura GraphQL engine

Further reading: [Firebase SDK for Cloud Functions](https://firebase.google.com/docs/functions/)

## Install and deploy

1. Create a Firebase project using the [Firebase Console](https://console.firebase.google.com).
1. Clone or download this repo and `cd` into the `firebase-cloud-functions` directory.
1. You must have the Firebase CLI installed. If you don't have it, install it with `npm install -g firebase-tools` and then configure it with `firebase login`.
1. Configure the CLI locally by using `firebase init` and select your project in the list.
1. Follow [Add the Firebase Admin SDK to Your Server](https://firebase.google.com/docs/admin/setup) and save the credentials file as `config.js`.
1. Copy `index.js` and `config.js` to the functions folder: `cp index.js config.js functions/`
1. Install Cloud Functions dependencies locally by running: `cd functions; npm install`
1. Deploy to Firebase Cloud Functions with `firebase deploy`.

Once deployed, an endpoint like the following will be created and displayed:

```
https://us-central1-xxxxx-auth.cloudfunctions.net/hasuraWebhook
```

## Add webhook endpoint to Hasura GraphQL

Set `--auth-hook` or `HASURA_GRAPHQL_AUTH_HOOK` to the endpoint obtained above.

See the [GraphQL engine server flags reference](https://docs.hasura.io/1.0/graphql/manual/deployment/graphql-engine-flags/reference.html) for details.
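For instance, if the engine runs as a Docker container, the variable can be passed at startup. This is only a sketch with placeholder values: the published port, database URL, and image tag are assumptions, and the auth-hook URL is the endpoint printed by `firebase deploy`.

```shell
docker run -d -p 8080:8080 \
  -e HASURA_GRAPHQL_DATABASE_URL='postgres://user:password@host:5432/dbname' \
  -e HASURA_GRAPHQL_AUTH_HOOK='https://us-central1-xxxxx-auth.cloudfunctions.net/hasuraWebhook' \
  hasura/graphql-engine:latest
```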
## Create table and set permission

Follow [Common roles and auth examples](https://docs.hasura.io/1.0/graphql/manual/auth/common-roles-auth-examples.html)
in the Hasura docs for details on how to set up permissions on a table.

Make sure to change the `id` column of the user table to the `TEXT` type, as the `uid` sent from the webhook is in Firebase User UID format (e.g. `0LnvZc7405TjRTbjURhZYYVXPI52`).
## How to call the webhook from frontend JS code (React, VueJS, Angular, etc.)

postAxios.js
```js
import axiosBase from 'axios'
import * as firebase from 'firebase'

const getIdToken = async () => {
  return new Promise((resolve, reject) => {
    firebase.auth().onAuthStateChanged(function (user) {
      if (user) {
        resolve(firebase.auth().currentUser.getIdToken())
      } else {
        reject(Error('user logged out'))
      }
    })
  })
}

export const postAxios = async (queryString) => {
  const idToken = await getIdToken()

  const axios = axiosBase.create({
    baseURL: 'https://YOURHASURADOMAIN/v1alpha1/graphql',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer ' + idToken
    },
    responseType: 'json',
    method: 'post'
  })

  return await axios({
    data: {
      query: queryString
    }
  }).catch(({response: r}) => console.log(r))
}
```

userService.js
```js
import { postAxios } from './postAxios'

export default {
  async getUsers () {
    const queryString = `
      query {
        user {
          id
          name
        }
      }
    `

    const result = await postAxios(queryString)
    return result.data.data.user
  }
}
```
@ -0,0 +1,55 @@
const functions = require('firebase-functions');

var admin = require('firebase-admin');
var serviceAccount = require('./config.json');

exports.hasuraWebhook = functions.https.onRequest((request, response) => {
  var error = null;

  if (serviceAccount) {
    try {
      admin.initializeApp({
        credential: admin.credential.cert(serviceAccount)
      });
    } catch (e) {
      error = e;
    }
  }

  var authHeaders = request.get('Authorization');
  // Send anonymous role if there are no auth headers
  if (!authHeaders) {
    response.json({'x-hasura-role': 'anonymous'});
    return;
  } else {
    // Validate the received id_token
    var idToken = extractToken(authHeaders);
    console.log(idToken);
    admin.auth().verifyIdToken(idToken)
      .then((decodedToken) => {
        console.log('decodedToken', decodedToken);
        var hasuraVariables = {
          'X-Hasura-User-Id': decodedToken.uid,
          'X-Hasura-Role': 'user'
        };
        console.log(hasuraVariables); // For debug
        // Send appropriate variables
        response.json(hasuraVariables);
        return;
      })
      .catch((e) => {
        // Authentication error: fall back to the anonymous role
        console.log(e);
        response.json({'x-hasura-role': 'anonymous'});
      });
  }
});

const extractToken = (bearerToken) => {
  const regex = /^(Bearer) (.*)$/g;
  const match = regex.exec(bearerToken);
  if (match && match[2]) {
    return match[2];
  }
  return null;
}
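The `Bearer`-token parsing used by the webhook above can be exercised in isolation; the sample header values below are hypothetical. A minimal sketch:

```javascript
// Same parsing as the webhook's extractToken helper: returns the token
// portion of an "Authorization: Bearer <token>" header, or null.
const extractToken = (bearerToken) => {
  const regex = /^(Bearer) (.*)$/g;
  const match = regex.exec(bearerToken);
  if (match && match[2]) {
    return match[2];
  }
  return null;
};

// A malformed header yields null, which the webhook maps to the
// anonymous role.
console.log(extractToken('Bearer abc123')); // abc123
console.log(extractToken('Basic abc123')); // null
```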
community/boilerplates/auth-webhooks/nodejs-express/.gitignore (vendored, new file, 1 line)
@ -0,0 +1 @@
*node_modules

@ -0,0 +1 @@
web: node server.js
community/boilerplates/auth-webhooks/nodejs-express/README.md (new file, 123 lines)
@ -0,0 +1,123 @@
# Sample Auth Webhook for Hasura GraphQL engine

This is a sample auth webhook for authenticating requests to the Hasura GraphQL engine.

It has boilerplate code written for Auth0 and Firebase auth. There is also a generic sample handler in `server.js` where you can handle your custom auth providers.

## Quick deploy

<!--
### Deploy with Heroku (recommended)

1. Click the following button to deploy to Heroku.

[![Deploy](https://www.herokucdn.com/deploy/button.svg)](https://heroku.com/deploy?template=https://github.com/hasura/sample-auth-webhook)

2. Once it is deployed, go to `Manage App > Settings` of your app and set the following environment variables if you want to use the associated providers.

- **AUTH_ZERO_DOMAIN**: Example `test.auth0.com`
- **FIREBASE_CONFIG**: Copy the contents of your serviceAccount JSON file into this field. Example:
  ```json
  {
    "type": "service_account",
    "project_id": "testapp-2222",
    "private_key_id": "f02aca08952f702de43ed577b428f405efe2d377",
    "private_key": "-----BEGIN PRIVATE KEY-----\n<your-private-key>\n-----END PRIVATE KEY-----\n",
    "client_email": "firebase-adminsdk-t4sik@testapp-24a60.iam.gserviceaccount.com",
    "client_id": "113608616484852272199",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/firebase-adminsdk-t4sik%40testapp-22222.iam.gserviceaccount.com"
  }
  ```

If you are not using an auth provider, you need not set the environment variable associated with it.
-->

### Deploy using [Now](https://zeit.co/now)

Run the following commands to deploy using Now.

```bash
git clone https://github.com/hasura/graphql-engine
cd graphql-engine/community/boilerplates/auth-webhooks/nodejs-express
npm install -g now
now -e \
AUTH_ZERO_DOMAIN='test.auth0.com' -e \
FIREBASE_CONFIG='{
  "type": "service_account",
  "project_id": "testapp-2222",
  "private_key_id": "f02aca08952f702de43ed577b428f405efe2d377",
  "private_key": "-----BEGIN PRIVATE KEY-----\n<your-private-key>\n-----END PRIVATE KEY-----\n",
  "client_email": "firebase-adminsdk-t4sik@testapp-24a60.iam.gserviceaccount.com",
  "client_id": "113608616484852272199",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://accounts.google.com/o/oauth2/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/firebase-adminsdk-t4sik%40testapp-22222.iam.gserviceaccount.com"
}'
```

If you are not using an auth provider, you need not set the environment variable associated with it. For example, if you are not using Firebase, the last command you should run is:

```bash
now -e \
AUTH_ZERO_DOMAIN='test.auth0.com'
```

### Deploy with Glitch

1. Click the following button to edit on Glitch.

[![glitch-deploy-button](assets/deploy-glitch.png)](http://glitch.com/edit/#!/import/github/hasura/graphql-engine/community/boilerplates/auth-webhooks/nodejs-express)

2. Add the following environment variables in the `.env` file on Glitch.

```env
AUTH_ZERO_DOMAIN='test.auth0.com'
FIREBASE_CONFIG='{
  "type": "service_account",
  "project_id": "testapp-2222",
  "private_key_id": "f02aca08952f702de43ed577b428f405efe2d377",
  "private_key": "-----BEGIN PRIVATE KEY-----\n<your-private-key>\n-----END PRIVATE KEY-----\n",
  "client_email": "firebase-adminsdk-t4sik@testapp-24a60.iam.gserviceaccount.com",
  "client_id": "113608616484852272199",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://accounts.google.com/o/oauth2/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/firebase-adminsdk-t4sik%40testapp-22222.iam.gserviceaccount.com"
}'
```

If you are not using an auth provider, you need not set the environment variable associated with it. For example, if you are not using Firebase, the only variable you need is:

```env
AUTH_ZERO_DOMAIN='test.auth0.com'
```

## Usage with Hasura GraphQL engine

Once you have deployed this webhook, you can use it along with the GraphQL engine. You have to set the webhook URL as an environment variable in the docker container that runs the GraphQL engine.

*[Read the docs](https://docs.hasura.io/1.0/graphql/manual/auth/webhook.html).*
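As a sketch, the deployed webhook would be wired in like this; the webhook domain, database URL, and image tag below are placeholders, and the path points at one of the provider routes this boilerplate exposes (e.g. `/auth0/webhook`):

```shell
docker run -d -p 8080:8080 \
  -e HASURA_GRAPHQL_DATABASE_URL='postgres://user:password@host:5432/dbname' \
  -e HASURA_GRAPHQL_AUTH_HOOK='https://your-webhook.example.com/auth0/webhook' \
  hasura/graphql-engine:latest
```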
### Auth0

Send the Auth0 `access_token` as a header while making queries to the `graphql-engine`:

```json
{
  "Authorization": "Bearer <access_token>"
}
```

### Firebase

Send the Firebase `id_token` as a header while making queries to the `graphql-engine`:

```json
{
  "Authorization": "Bearer <id_token>"
}
```
community/boilerplates/auth-webhooks/nodejs-express/app.json (new file, 13 lines)
@ -0,0 +1,13 @@
{
  "name": "Hasura GraphQL Engine Auth Webhook",
  "description": "Hasura GraphQL Engine Auth Webhook <> heroku",
  "keywords": [
    "graphql",
    "heroku",
    "postgres",
    "hasura",
    "auth",
    "webhook"
  ],
  "repository": "https://github.com/hasura/sample-auth-webhook"
}

(Binary file added, 8.4 KiB; not shown.)
@ -0,0 +1,60 @@
var express = require('express');
var auth0Router = express.Router();
var requestClient = require('request');
var auth0Domain = process.env.AUTH_ZERO_DOMAIN;

/*
  Auth webhook handler for auth0

  Flow:
  1) Expects access_token to be sent as 'Authorization: Bearer <access-token>'
  2) Verifies the access_token by fetching the /userinfo endpoint from auth0

  Usage:
  1) From your application, when you call Hasura's GraphQL APIs remember to send the access_token from auth0 as an authorization header
  2) Replace the url (https://test-hasura.auth0.com/userinfo) in the code below with your own auth0 app url
*/

auth0Router.route('/webhook').get((request, response) => {
  // Throw 500 if auth0 domain is not configured
  if (!auth0Domain) {
    response.status(500).send('Auth0 domain not configured');
    return;
  }

  var token = request.get('Authorization');

  if (!token) {
    response.json({'x-hasura-role': 'anonymous'});
    return;
  } else {
    // Fetch information about this user from
    // auth0 to validate this token
    // NOTE: Replace the URL with your own auth0 app url
    var options = {
      url: `https://${auth0Domain}/userinfo`,
      headers: {
        Authorization: token,
        'Content-Type': 'application/json'
      }
    };

    requestClient(options, (err, res, body) => {
      if (!err && res.statusCode == 200) {
        var userInfo = JSON.parse(body);
        console.log(userInfo); // For debug
        var hasuraVariables = {
          'X-Hasura-User-Id': userInfo.sub,
          'X-Hasura-Role': 'user'
        };
        console.log(hasuraVariables); // For debug
        response.json(hasuraVariables);
      } else {
        // Error response from auth0
        console.log(err, res, body);
        response.json({'x-hasura-role': 'anonymous'});
        return;
      }
    });
  }
});

module.exports = auth0Router;
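The mapping at the heart of the handler above is small enough to state on its own. A minimal sketch; `toHasuraVariables` is a hypothetical name (the handler inlines this logic), and `sub` is the user-id field of Auth0's `/userinfo` response:

```javascript
// Maps an Auth0 /userinfo payload to the session variables Hasura expects.
const toHasuraVariables = (userInfo) => ({
  'X-Hasura-User-Id': userInfo.sub,
  'X-Hasura-Role': 'user'
});

console.log(toHasuraVariables({ sub: 'auth0|12345' }));
```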
@ -0,0 +1,3 @@
const config = process.env.FIREBASE_CONFIG;

module.exports = config;
@ -0,0 +1,65 @@
var express = require('express');
var firebaseRouter = express.Router();
var admin = require('firebase-admin');
var serviceAccount = require('./config.js');
var error = null;
// Initialize the Firebase admin SDK with your service account credentials
if (serviceAccount) {
  try {
    admin.initializeApp({
      credential: admin.credential.cert(JSON.parse(serviceAccount))
    });
  } catch (e) {
    error = e;
  }
}

firebaseRouter.route("/webhook").get((request, response) => {
  // Throw 500 if firebase is not configured
  if (!serviceAccount) {
    response.status(500).send('Firebase not configured');
    return;
  }
  // Check for errors initializing firebase SDK
  if (error) {
    response.status(500).send('Invalid firebase configuration');
    return;
  }
  // Get authorization headers
  var authHeaders = request.get('Authorization');
  // Send anonymous role if there are no auth headers
  if (!authHeaders) {
    response.json({'x-hasura-role': 'anonymous'});
    return;
  } else {
    // Validate the received id_token
    var idToken = extractToken(authHeaders);
    console.log(idToken);
    admin.auth().verifyIdToken(idToken)
      .then((decodedToken) => {
        var hasuraVariables = {
          'X-Hasura-User-Id': decodedToken.uid,
          'X-Hasura-Role': 'user'
        };
        console.log(hasuraVariables); // For debug
        // Send appropriate variables
        response.json(hasuraVariables);
      })
      .catch((e) => {
        // Authentication error: fall back to the anonymous role
        console.log(e);
        response.json({'x-hasura-role': 'anonymous'});
      });
  }
});

const extractToken = (bearerToken) => {
  const regex = /^(Bearer) (.*)$/g;
  const match = regex.exec(bearerToken);
  if (match && match[2]) {
    return match[2];
  }
  return null;
}

module.exports = firebaseRouter;
community/boilerplates/auth-webhooks/nodejs-express/package-lock.json (generated, new file, 3595 lines)
File diff suppressed because it is too large.

@ -0,0 +1,17 @@
{
  "name": "sample-auth-webhook",
  "version": "1.0.0",
  "description": "",
  "main": "server.js",
  "scripts": {
    "start": "node server.js",
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "wawhal",
  "license": "ISC",
  "dependencies": {
    "express": "^4.16.3",
    "firebase-admin": "^5.12.1",
    "request": "^2.87.0"
  }
}
@ -0,0 +1,54 @@
// Sample webhook showing what a hasura auth webhook looks like

// init project
var express = require('express');
var app = express();
var requestClient = require('request');
var port = process.env.PORT || 3000;

/* A simple sample
  Flow:
  1) Extracts token
  2) Fetches userInfo in a mock function
  3) Returns hasura variables
*/
function fetchUserInfo (token, cb) {
  // This function takes a token and then makes an async
  // call to the session-cache or database to fetch
  // data that is needed for Hasura's access control rules
  cb();
}

app.get('/', (req, res) => {
  res.send('Webhooks are running');
});

app.get('/simple/webhook', (request, response) => {
  // Extract token from request
  var token = request.get('Authorization');

  // Fetch user_id that is associated with this token
  fetchUserInfo(token, (result) => {
    // Return appropriate response to Hasura
    var hasuraVariables = {
      'X-Hasura-Role': 'user', // result.role
      'X-Hasura-User-Id': '1' // result.user_id
    };
    response.json(hasuraVariables);
  });
});

// Firebase handler
var firebaseRouter = require('./firebase/firebaseHandler');
app.use('/firebase', firebaseRouter);

// Auth0 handler
var auth0Router = require('./auth0/auth0Handler');
app.use('/auth0', auth0Router);

// listen for requests :)
var listener = app.listen(port, function () {
  console.log('Your app is listening on port ' + port);
});
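With the server above running locally (port 3000 by default), the generic handler can be smoke-tested from a second terminal; the token value is hypothetical, and the sample handler always returns the fixed `user`/`1` variables:

```shell
curl -H 'Authorization: Bearer some-token' http://localhost:3000/simple/webhook
```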
community/boilerplates/custom-resolvers/.gitignore (vendored, new file, 5 lines)
@ -0,0 +1,5 @@
node_modules
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*

community/boilerplates/custom-resolvers/Dockerfile (new file, 11 lines)
@ -0,0 +1,11 @@
FROM node:alpine

WORKDIR /server

COPY ./package.json /server/

RUN npm install

COPY . /server/

CMD ["npm", "start"]

community/boilerplates/custom-resolvers/Procfile (new file, 1 line)
@ -0,0 +1 @@
web: npm start

community/boilerplates/custom-resolvers/README.md (new file, 195 lines)
@ -0,0 +1,195 @@
# GraphQL Custom Resolver Example

This is a simple example of using a custom resolver with Hasura's GraphQL API.

## Motivation

Hasura GraphQL Engine provides instant GraphQL APIs over the tables and views of
any Postgres database. It also comes with a fine-grained access control layer
that helps you restrict the data that can be consumed.

However, sometimes you might have to write custom resolvers to capture business
logic that is unrelated to the database, needs to execute a custom transaction,
or writes to the database.

In this example, we illustrate how to write custom resolvers and merge them with
the Hasura GraphQL Engine. We combine Hasura GraphQL Engine's GraphQL API
running at `https://bazookaand.herokuapp.com/v1alpha1/graphql` with the
following custom resolvers:

1. A `hello` query
2. A `count` query (that returns a counter from another data source)
3. An `increment_counter` mutation that increments the value of `count`
4. A `user_average_age` query that directly makes an SQL query to Postgres
   using knex

You can use this as a boilerplate to write custom resolvers with Hasura GraphQL
Engine.

![Custom resolvers with Hasura GraphQL engine](./assets/custom-resolvers-diagram.png)
## Usage
|
||||
|
||||
1. Install the required dependencies.
|
||||
|
||||
```bash
|
||||
npm install
|
||||
```
|
||||
|
||||
2. Set appropriate environment variables for the GraphQL Engine URL, the access
|
||||
key to GraphQL Engine and the Postgres connection string.
|
||||
|
||||
|
||||
```bash
|
||||
# without the /v1alpha1/graphql part
|
||||
export HASURA_GRAPHQL_ENGINE_URL='https://hge.herokuapp.com'
|
||||
export X_HASURA_ACCESS_KEY='<access_key>'
|
||||
|
||||
# Only required for the direct SQL resolver
|
||||
export PG_CONNECTION_STRING='<postgres-connection-string>'
|
||||
```
|
||||
|
||||
3. Run the server
|
||||
|
||||
```bash
|
||||
npm start
|
||||
```
|
||||
|
||||
## Deployment
|
||||
|
||||
You can deploy this sample boilerplate with:
|
||||
|
||||
* Now
|
||||
* Docker
|
||||
|
||||
### Deploy using [Now](https://zeit.co/now)
|
||||
|
||||
Run these commands to instantly deploy this boilerplate using Now.
|
||||
|
||||
```bash
|
||||
git clone https://github.com/hasura/graphql-engine
|
||||
cd community/boilerplates/custom-resolvers
|
||||
now -e \
|
||||
HASURA_GRAPHQL_ENGINE_URL='https://hge.herokuapp.com' -e \
|
||||
X_HASURA_ACCESS_KEY='<access_key>' --npm
|
||||
```
|
||||
|
||||
### Deploy the docker image
|
||||
|
||||
This project comes with a [`Dockerfile`](Dockerfile).
|
||||
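A minimal build-and-run sketch (the image tag is a placeholder; the environment variables are the ones described in the Usage section, and port 4000 is the server's default):

```bash
docker build -t custom-resolvers-boilerplate .
docker run -p 4000:4000 \
  -e HASURA_GRAPHQL_ENGINE_URL='https://hge.herokuapp.com' \
  -e X_HASURA_ACCESS_KEY='<access_key>' \
  custom-resolvers-boilerplate
```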
|
||||
## Implementation Details
|
||||
|
||||
We will use Apollo's `graphql-tools` library to make a working GraphQL Schema
|
||||
out of our custom resolvers. Finally, we will merge these resolvers with the
|
||||
existing Hasura schema so that it can be queried under the same endpoint.
|
||||
|
||||
### Writing type definitions
|
||||
|
||||
The type definitions are written in standard GraphQL format. We need the
|
||||
following queries in our custom logic:
|
||||
|
||||
|
||||
```graphql
|
||||
type Query {
|
||||
# field hello will return "Hello World" which is a string
|
||||
hello: String,
|
||||
|
||||
# field count will return an Int
|
||||
count: Int,
|
||||
|
||||
# field user_average_age will return a Float
|
||||
user_average_age: Float
|
||||
}
|
||||
|
||||
type Mutation {
|
||||
# field "increment_counter" will increment the counter and return type IncrementCounter
|
||||
increment_counter: IncrementCounter
|
||||
}
|
||||
|
||||
# IncrementCounter returns the new value of the counter
|
||||
type IncrementCounter {
|
||||
new_count: Int
|
||||
||||
}
|
||||
```
|
||||
|
||||
### Writing resolvers
|
||||
|
||||
Every resolver is a function that is executed with the following arguments in
|
||||
the order below:
|
||||
|
||||
1. `root`: The root of the current field
|
||||
2. `args`: The arguments provided in the query
|
||||
3. `context`: The server context, which also consists of headers
|
||||
4. `info`: The AST document related to the query made
|
||||
|
||||
The resolvers in our case are:
|
||||
|
||||
```js
|
||||
const resolvers = {
|
||||
// resolvers for queries
|
||||
Query: {
|
||||
hello: (root, args, context, info) => {
|
||||
// return response
|
||||
return 'Hello world!';
|
||||
},
|
||||
count: (root, args, context, info) => {
|
||||
// return response
|
||||
return count;
|
||||
},
|
||||
user_average_age: async (root, args, context, info) => {
|
||||
// make SQL query using knex client
|
||||
const response = await knexClient('user')
|
||||
.avg('age');
|
||||
// return response
|
||||
return response[0].avg;
|
||||
}
|
||||
},
|
||||
|
||||
// resolvers for mutations
|
||||
Mutation: {
|
||||
increment_counter: (root, args, context, info) => {
|
||||
// return response
|
||||
return { new_count: ++count };
|
||||
}
|
||||
}
|
||||
};
|
||||
```
|
||||
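Because resolvers are plain functions, they can be sanity-checked without running a server at all. A standalone sketch mirroring the resolvers above (not an additional file in the boilerplate):

```js
// Standalone sketch: resolvers are ordinary functions of (root, args, context, info)
let count = 0;

const resolvers = {
  Query: {
    hello: (root, args, context, info) => 'Hello world!',
    count: () => count,
  },
  Mutation: {
    increment_counter: () => ({ new_count: ++count }),
  },
};

// Invoke them directly, the same way graphql-js would during execution
console.log(resolvers.Query.hello(null, {}, {}, {})); // Hello world!
console.log(resolvers.Mutation.increment_counter().new_count); // 1
console.log(resolvers.Query.count(null, {}, {}, {})); // 1
```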
|
||||
### Making a new schema out of these custom resolvers
|
||||
|
||||
Use the `makeExecutableSchema()` function from the `graphql-tools` library to make a
|
||||
schema out of the type definitions and resolvers above.
|
||||
|
||||
```js
|
||||
import { makeExecutableSchema } from 'graphql-tools';
|
||||
|
||||
const executableCustomSchema = makeExecutableSchema({
|
||||
typeDefs,
|
||||
resolvers,
|
||||
});
|
||||
```
|
||||
|
||||
### Merging with existing Hasura schema and serving it
|
||||
|
||||
Merge these custom resolvers with the Hasura GraphQL Engine by using the
|
||||
`mergeSchemas()` function from the `graphql-tools` library.
|
||||
|
||||
```js
|
||||
import { mergeSchemas } from 'graphql-tools';
|
||||
|
||||
const newSchema = mergeSchemas({
|
||||
schemas: [
|
||||
executableCustomSchema,
|
||||
executableHasuraSchema
|
||||
]
|
||||
});
|
||||
|
||||
const server = new ApolloServer({
|
||||
schema: newSchema
|
||||
});
|
||||
|
||||
server.listen().then(({ url }) => {
|
||||
console.log(`Server running at ${url}`);
|
||||
});
|
||||
```
|
||||
|
||||
Check [this file](src/index.js) to see how it is done.
|
12
community/boilerplates/custom-resolvers/app.json
Normal file
@ -0,0 +1,12 @@
|
||||
{
|
||||
"name": "Custom Resolvers boilerplate",
|
||||
"description": "Custom resolvers boilerplate for Hasura GraphQL Engine",
|
||||
"logo": "https://storage.googleapis.com/hasura-graphql-engine/console/assets/favicon.png",
|
||||
"keywords": [
|
||||
"graphql",
|
||||
"heroku",
|
||||
"postgres",
|
||||
"hasura"
|
||||
],
|
||||
"repository": "https://github.com/hasura/graphql-engine/community/boilerplates/custom-resolvers"
|
||||
}
|
Binary file not shown.
After Width: | Height: | Size: 60 KiB |
2845
community/boilerplates/custom-resolvers/package-lock.json
generated
Normal file
File diff suppressed because it is too large
25
community/boilerplates/custom-resolvers/package.json
Normal file
@ -0,0 +1,25 @@
|
||||
{
|
||||
"license": "MIT",
|
||||
"repository": {
|
||||
"type": "git",
|
||||
"url": "https://github.com/hasura/graphql-engine/community/boilerplates/custom-resolvers"
|
||||
},
|
||||
"scripts": {
|
||||
"start": "node -r esm src/index.js"
|
||||
},
|
||||
"dependencies": {
|
||||
"apollo-link-http": "^1.5.4",
|
||||
"apollo-link-ws": "^1.0.8",
|
||||
"apollo-server": "^2.0.6",
|
||||
"apollo-server-express": "^1.3.6",
|
||||
"apollo-utilities": "^1.0.20",
|
||||
"esm": "^3.0.61",
|
||||
"graphql": "^0.13.2",
|
||||
"graphql-tools-with-subscriptions": "^1.0.0",
|
||||
"knex": "^0.15.2",
|
||||
"node-fetch": "^2.1.2",
|
||||
"pg": "^7.4.3",
|
||||
"subscriptions-transport-ws": "^0.9.14",
|
||||
"ws": "^6.0.0"
|
||||
}
|
||||
}
|
@ -0,0 +1,45 @@
|
||||
import knex from 'knex';
|
||||
import pg from 'pg';
|
||||
|
||||
const { PG_CONNECTION_STRING } = process.env;
|
||||
|
||||
// create a knex client to connect directly to postgres
|
||||
pg.defaults.ssl = true;
|
||||
const knexClient = knex({
|
||||
client: 'pg',
|
||||
connection: PG_CONNECTION_STRING
|
||||
});
|
||||
|
||||
let count = 0;
|
||||
|
||||
// custom resolvers
|
||||
const resolvers = {
|
||||
// resolvers for queries
|
||||
Query: {
|
||||
hello: (root, args, context, info) => {
|
||||
// return response
|
||||
return 'Hello world!';
|
||||
},
|
||||
count: (root, args, context, info) => {
|
||||
// return response
|
||||
return count;
|
||||
},
|
||||
user_average_age: async (root, args, context, info) => {
|
||||
// make SQL query using knex client
|
||||
const response = await knexClient('user')
|
||||
.avg('age');
|
||||
// return response
|
||||
return response[0].avg;
|
||||
}
|
||||
},
|
||||
|
||||
// resolvers for mutations
|
||||
Mutation: {
|
||||
increment_counter: (root, args, context, info) => {
|
||||
// return response
|
||||
return { new_count: ++count };
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
export default resolvers;
|
@ -0,0 +1,18 @@
|
||||
// custom type definitions
|
||||
const typeDefs = `
|
||||
type Query {
|
||||
hello: String,
|
||||
count: Int,
|
||||
user_average_age: Float
|
||||
}
|
||||
|
||||
type Mutation {
|
||||
increment_counter: MutationResp
|
||||
}
|
||||
|
||||
type MutationResp {
|
||||
new_count: Int
|
||||
}
|
||||
`;
|
||||
|
||||
export default typeDefs;
|
53
community/boilerplates/custom-resolvers/src/index.js
Normal file
@ -0,0 +1,53 @@
|
||||
import fetch from 'node-fetch';
|
||||
import { ApolloServer } from 'apollo-server';
|
||||
import { mergeSchemas, makeExecutableSchema } from 'graphql-tools';
|
||||
import { getRemoteSchema } from './utils';
|
||||
import typeDefs from './customTypeDefs';
|
||||
import resolvers from './customResolvers';
|
||||
|
||||
const HASURA_GRAPHQL_ENGINE_URL = process.env.HASURA_GRAPHQL_ENGINE_URL || `https://bazookaand.herokuapp.com`;
|
||||
const HASURA_GRAPHQL_API_URL = HASURA_GRAPHQL_ENGINE_URL + '/v1alpha1/graphql';
|
||||
const ACCESS_KEY = process.env.X_HASURA_ACCESS_KEY;
|
||||
|
||||
const runServer = async () => {
|
||||
|
||||
// make Hasura schema
|
||||
const executableHasuraSchema = await getRemoteSchema(
|
||||
HASURA_GRAPHQL_API_URL,
|
||||
ACCESS_KEY && { 'x-hasura-access-key': ACCESS_KEY }
|
||||
);
|
||||
|
||||
// make executable schema out of custom resolvers and typedefs
|
||||
const executableCustomSchema = makeExecutableSchema({
|
||||
typeDefs,
|
||||
resolvers,
|
||||
});
|
||||
|
||||
// merge custom resolvers with Hasura schema
|
||||
const finalSchema = mergeSchemas({
|
||||
schemas: [
|
||||
executableCustomSchema,
|
||||
executableHasuraSchema,
|
||||
]
|
||||
});
|
||||
|
||||
// instantiate a server instance
|
||||
const server = new ApolloServer({
|
||||
schema: finalSchema,
|
||||
introspection: true,
|
||||
playground: false
|
||||
});
|
||||
|
||||
// run the server
|
||||
server.listen({
|
||||
port: process.env.PORT || 4000
|
||||
}).then(({url}) => {
|
||||
console.log('Server running. Open ' + url + ' to run queries.');
|
||||
});
|
||||
}
|
||||
|
||||
try {
|
||||
runServer();
|
||||
} catch (e) {
|
||||
console.log(e, e.message, e.stack);
|
||||
}
|
62
community/boilerplates/custom-resolvers/src/utils.js
Normal file
@ -0,0 +1,62 @@
|
||||
import { HttpLink } from 'apollo-link-http';
|
||||
import { WebSocketLink } from 'apollo-link-ws';
|
||||
import { SubscriptionClient } from 'subscriptions-transport-ws';
|
||||
import ws from 'ws';
|
||||
import { makeRemoteExecutableSchema, introspectSchema } from 'graphql-tools';
|
||||
import fetch from 'node-fetch';
|
||||
import { split } from 'apollo-link';
|
||||
import { getMainDefinition } from 'apollo-utilities';
|
||||
|
||||
const { HASURA_GRAPHQL_ENGINE_AUTH_HOOK } = process.env;
|
||||
|
||||
// util function to fetch and create remote schema
|
||||
export const getRemoteSchema = async (uri, headers) => {
|
||||
const link = makeHttpAndWsLink(uri, headers);
|
||||
const schema = await introspectSchema(link);
|
||||
return makeRemoteExecutableSchema({
|
||||
schema,
|
||||
link
|
||||
});
|
||||
};
|
||||
|
||||
/* create an apollo-link instance that makes
|
||||
WS connection for subscriptions and
|
||||
HTTP connection for queries and mutations
|
||||
*/
|
||||
const makeHttpAndWsLink = (uri, headers) => {
|
||||
|
||||
// Create an http link:
|
||||
const httpLink = new HttpLink({
|
||||
uri,
|
||||
fetch,
|
||||
headers
|
||||
});
|
||||
|
||||
|
||||
// Create a WebSocket link:
|
||||
const wsLink = new WebSocketLink(new SubscriptionClient(
|
||||
uri,
|
||||
{
|
||||
reconnect: true,
|
||||
connectionParams: {
|
||||
headers
|
||||
}
|
||||
},
|
||||
ws
|
||||
));
|
||||
|
||||
// choose the link to use based on operation
|
||||
const link = split(
|
||||
// split based on operation type
|
||||
({ query }) => {
|
||||
const { kind, operation } = getMainDefinition(query);
|
||||
return kind === 'OperationDefinition' && operation === 'subscription';
|
||||
},
|
||||
wsLink,
|
||||
httpLink,
|
||||
);
|
||||
|
||||
|
||||
return link;
|
||||
};
|
||||
|
2
community/boilerplates/serverless-triggers/.gitignore
vendored
Normal file
@ -0,0 +1,2 @@
|
||||
.DS_Store
|
||||
*.temp
|
69
community/boilerplates/serverless-triggers/README.md
Normal file
@ -0,0 +1,69 @@
|
||||
# Event trigger boilerplates for serverless cloud functions
|
||||
|
||||
This repository contains boilerplate functions for various use-cases, for different serverless or cloud function platforms. These functions implement sample use-cases of the different types of asynchronous business logic operations that can be triggered by the Hasura GraphQL Engine on database insert, update or delete.
|
||||
|
||||
Examples in this repository support the following cloud function platforms:
|
||||
|
||||
* AWS Lambda
|
||||
|
||||
* Google Cloud Functions
|
||||
|
||||
* Microsoft Azure Functions
|
||||
|
||||
* Zeit serverless docker
|
||||
|
||||
Note: *If you want to add support for other platforms, please submit a PR or create an issue and tag it with `help-wanted`*
|
||||
|
||||
|
||||
## Events Trigger and Serverless functions architecture
|
||||
|
||||
![Architecture diagram](assets/basic-event-triggers-arch-diagram.png)
|
||||
|
||||
## Setup Postgres + Hasura GraphQL engine
|
||||
This boilerplate code assumes that you already have an HGE instance running.
|
||||
|
||||
If not, you can visit the [docs](https://docs.hasura.io/1.0/graphql/manual/getting-started/index.html) and set up Postgres + HGE.
|
||||
|
||||
## Deploying the boilerplate examples
|
||||
|
||||
Follow the cloud function provider docs to set these up. Each provider-specific README also contains CLI instructions.
|
||||
|
||||
## Documented examples
|
||||
|
||||
* A simple example to echo the trigger payload.
|
||||
|
||||
* Make a GraphQL mutation on some data update/insert, from the serverless function (*write related data back into the database*).
|
||||
|
||||
* Asynchronously send a FCM/APNS push notification.
|
||||
|
||||
* ETL or data transformation: transform the trigger payload and update an algolia index.
|
||||
|
||||
Note: Some of the examples have a corresponding `good-first-issue` issue in the repository. Please check out the checklist in the README in the cloud provider folders for such issues.
|
||||
|
||||
## Directory Structure
|
||||
|
||||
Boilerplates have been organised into top-level folders for each cloud function platform. Inside each such folder, there's a folder for each language, and within that, one for each use-case. The README for each cloud function provider has a list of available boilerplates.
|
||||
|
||||
The following is a representative tree of the folder structure:
|
||||
|
||||
.
|
||||
├── aws-lambda
|
||||
| |── README.md
|
||||
| |── nodejs
|
||||
| | |── echo
|
||||
| | |── mutation
|
||||
| |── python
|
||||
| | ...
|
||||
|
|
||||
├── azure-functions
|
||||
| |── README.md
|
||||
| |── nodejs
|
||||
| | |── echo
|
||||
| | |── mutation
|
||||
| |── python
|
||||
| | ...
|
||||
|
||||
|
||||
## Contributing and boilerplate requests
|
||||
|
||||
Want to contribute to this repo? Issues marked `good-first-issue` or `help-wanted` are a good place to begin. Please submit a PR for new boilerplates (other use-cases or cloud function providers/platforms like Apache OpenWhisk, etc.). You can also create issues to request new boilerplates.
|
Binary file not shown.
After Width: | Height: | Size: 21 KiB |
@ -0,0 +1,27 @@
|
||||
# Boilerplates for AWS Lambda serverless functions and Hasura GraphQL Engine's Event Triggers
|
||||
|
||||
Sample cloud functions that can be triggered on changes in the database using GraphQL Engine's Event Triggers.
|
||||
|
||||
These are organized in language-specific folders.
|
||||
|
||||
**NOTE**
|
||||
Some of the languages/platforms are work in progress. We welcome contributions for the WIP languages. See issues and the following checklist:
|
||||
|
||||
| Folder name | Use-case| Node.js(6) | Python | Java | Go | C#
|
||||
|-------------|---------|:--------:|:------:|:----:|:---:|:---:
|
||||
| echo | echo the trigger payload | ✅ | ✅ | ❌ | ❌ | ❌
|
||||
| mutation | insert related data on an insert event using graphql mutation | ✅ | ✅ | ❌ | ❌ | ❌
|
||||
| push-notification | send push notification on database event | ❌ | ❌ | ❌ | ❌ | ❌
|
||||
| etl | transform the trigger payload and update an algolia index | ❌ | ❌ | ❌ | ❌ | ❌
|
||||
|
||||
|
||||
|
||||
## Pre-requisites
|
||||
|
||||
1. AWS account with billing enabled
|
||||
2. Hasura GraphQL Engine
|
||||
|
||||
### AWS setup
|
||||
You need to create a corresponding AWS Lambda for each of these examples.
|
||||
|
||||
Since the Hasura event system invokes webhooks as event triggers, we need to expose these Lambdas as webhooks. To do that, add an API Gateway trigger to each Lambda and add an API to it.
|
@ -0,0 +1,23 @@
|
||||
# Setup tables
|
||||
1. Create table:
|
||||
|
||||
```
|
||||
notes:
|
||||
id: int
|
||||
note: text
|
||||
```
|
||||
|
||||
# Setup AWS Lambda
|
||||
Create a lambda function in AWS. This will be our webhook.
|
||||
|
||||
1. Create a function.
|
||||
2. Select Node.js 6 as the runtime.
|
||||
3. Select "start from scratch".
|
||||
4. Add API gateway as a trigger.
|
||||
5. Add an API to API gateway.
|
||||
6. Add the code in `index.js`. The handler function of your lambda will be `index.handler`.
|
||||
|
||||
# Add the trigger in Hasura GraphQL
|
||||
1. In the Events tab, add a trigger
|
||||
2. Select all insert, update, delete operations for the trigger.
|
||||
3. Paste the API endpoint of your AWS lambda as the webhook.
|
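Once wired up, the Lambda receives the Hasura event wrapped in an API Gateway request body. You can simulate that locally to see what the handler will parse (a sketch; the id and note text are made up):

```js
// Build a mock API Gateway event carrying a Hasura INSERT payload
// (field names follow Hasura's event trigger format; values are illustrative)
const hasuraPayload = {
  table: { name: 'notes', schema: 'public' },
  event: {
    op: 'INSERT',
    data: { old: null, new: { id: 1, note: 'my note' } },
  },
};
const mockEvent = { body: JSON.stringify(hasuraPayload) };

// This mirrors what the handler does with the request
const request = JSON.parse(mockEvent.body);
console.log(`New note ${request.event.data.new.id} inserted, with data: ${request.event.data.new.note}`);
// New note 1 inserted, with data: my note
```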
@ -0,0 +1,28 @@
|
||||
// Lambda which just echoes back the event data
|
||||
|
||||
exports.handler = (event, context, callback) => {
|
||||
let request;
|
||||
try {
|
||||
request = JSON.parse(event.body);
|
||||
} catch (e) {
|
||||
return callback(null, {statusCode: 400, body: "cannot parse hasura event"});
|
||||
}
|
||||
|
||||
let response = {
|
||||
statusCode: 200,
|
||||
body: ''
|
||||
};
|
||||
console.log(request);
|
||||
|
||||
if (request.table.name === "notes" && request.event.op === "INSERT") {
|
||||
response.body = `New note ${request.event.data.new.id} inserted, with data: ${request.event.data.new.note}`;
|
||||
}
|
||||
else if (request.table.name === "notes" && request.event.op === "UPDATE") {
|
||||
response.body = `Note ${request.event.data.new.id} updated, with data: ${request.event.data.new.note}`;
|
||||
}
|
||||
else if (request.table.name === "notes" && request.event.op === "DELETE") {
|
||||
response.body = `Note ${request.event.data.old.id} deleted, with data: ${request.event.data.old.note}`;
|
||||
}
|
||||
|
||||
callback(null, response);
|
||||
};
|
2
community/boilerplates/serverless-triggers/aws-lambda/nodejs6/mutation/.gitignore
vendored
Normal file
@ -0,0 +1,2 @@
|
||||
node_modules
|
||||
*.zip
|
@ -0,0 +1,40 @@
|
||||
# Setup tables
|
||||
|
||||
1. Create the table using the console:
|
||||
|
||||
```
|
||||
Table name: notes
|
||||
|
||||
Columns:
|
||||
id: Integer auto-increment
|
||||
note: Text
|
||||
|
||||
Table name: note_revision
|
||||
|
||||
Columns:
|
||||
id: Integer auto-increment
|
||||
note: Text
|
||||
note_id: Integer (foreign key to notes.id)
|
||||
update_at: Timestamp, default `now()`
|
||||
|
||||
```
|
||||
|
||||
# Setup AWS Lambda
|
||||
Create a lambda function in AWS. This will be our webhook.
|
||||
|
||||
1. In this folder, run `npm install`
|
||||
2. Then create a zip: `zip -r hge-mutation.zip .`
|
||||
3. Create a Lambda function.
|
||||
4. Select Node.js 6 as the runtime.
|
||||
5. Select "start from scratch".
|
||||
6. Add API gateway as a trigger.
|
||||
7. Add an API to API gateway.
|
||||
8. Upload the zip from previous step. The handler function of your lambda will be `index.handler`.
|
||||
9. Add the following environment variables in your lambda config:
|
||||
1. `ACCESS_KEY`: this is the access key you configured when you set up HGE.
|
||||
2. `HGE_ENDPOINT`: the URL on which your HGE instance is running.
|
||||
|
||||
# Add the trigger in Hasura GraphQL
|
||||
1. In the Events tab, add a trigger
|
||||
2. Select all insert, update, delete operations for the trigger.
|
||||
3. Paste the API endpoint of your AWS lambda as the webhook.
|
@ -0,0 +1,44 @@
|
||||
// Lambda which gets triggered on insert, and in turns performs a mutation
|
||||
|
||||
const fetch = require('node-fetch');
|
||||
|
||||
const accessKey = process.env.ACCESS_KEY;
|
||||
const hgeEndpoint = process.env.HGE_ENDPOINT;
|
||||
|
||||
const query = `
|
||||
mutation updateNoteRevision ($noteId: Int!, $data: String!) {
|
||||
insert_note_revision (objects: [
|
||||
{
|
||||
note_id: $noteId,
|
||||
note: $data
|
||||
}
|
||||
]) {
|
||||
affected_rows
|
||||
}
|
||||
}
|
||||
`;
|
||||
|
||||
exports.handler = (event, context, callback) => {
|
||||
let request;
|
||||
try {
|
||||
request = JSON.parse(event.body);
|
||||
} catch (e) {
|
||||
return callback(null, {statusCode: 400, body: "cannot parse hasura event"});
|
||||
}
|
||||
|
||||
const response = {
|
||||
statusCode: 200,
|
||||
body: "success"
|
||||
};
|
||||
const qv = {noteId: request.event.data.old.id, data: request.event.data.old.note};
|
||||
fetch(hgeEndpoint + '/v1alpha1/graphql', {
|
||||
method: 'POST',
|
||||
body: JSON.stringify({query: query, variables: qv}),
|
||||
headers: {'Content-Type': 'application/json', 'x-hasura-access-key': accessKey},
|
||||
})
|
||||
.then(res => res.json())
|
||||
.then(json => {
|
||||
console.log(json);
|
||||
callback(null, response);
|
||||
});
|
||||
};
|
13
community/boilerplates/serverless-triggers/aws-lambda/nodejs6/mutation/package-lock.json
generated
Normal file
@ -0,0 +1,13 @@
|
||||
{
|
||||
"name": "mutation-trigger",
|
||||
"version": "1.0.0",
|
||||
"lockfileVersion": 1,
|
||||
"requires": true,
|
||||
"dependencies": {
|
||||
"node-fetch": {
|
||||
"version": "2.2.0",
|
||||
"resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.2.0.tgz",
|
||||
"integrity": "sha512-OayFWziIxiHY8bCUyLX6sTpDH8Jsbp4FfYd1j1f7vZyfgkcOnAyM4oQR16f8a0s7Gl/viMGRey8eScYk4V4EZA=="
|
||||
}
|
||||
}
|
||||
}
|
@ -0,0 +1,14 @@
|
||||
{
|
||||
"name": "mutation-trigger",
|
||||
"version": "1.0.0",
|
||||
"description": " sample lambda to react to database updates",
|
||||
"main": "index.js",
|
||||
"scripts": {
|
||||
"test": "echo \"Error: no test specified\" && exit 1"
|
||||
},
|
||||
"author": "",
|
||||
"license": "ISC",
|
||||
"dependencies": {
|
||||
"node-fetch": "^2.2.0"
|
||||
}
|
||||
}
|
@ -0,0 +1,23 @@
|
||||
# Setup tables
|
||||
1. Create table:
|
||||
|
||||
```
|
||||
notes:
|
||||
id: int
|
||||
note: text
|
||||
```
|
||||
|
||||
# Setup AWS Lambda
|
||||
Create a lambda function in AWS. This will be our webhook.
|
||||
|
||||
1. Create a function.
|
||||
2. Select Python 3.6 as the runtime.
|
||||
3. Select "start from scratch".
|
||||
4. Add API gateway as a trigger.
|
||||
5. Add an API to API gateway.
|
||||
6. Add the code in `echo.py`. The handler function of your lambda will be `echo.lambda_handler`.
|
||||
|
||||
# Add the trigger in Hasura GraphQL
|
||||
1. In the Events tab, add a trigger
|
||||
2. Select all insert, update, delete operations for the trigger.
|
||||
3. Paste the API endpoint of your AWS lambda as the webhook.
|
@ -0,0 +1,26 @@
|
||||
import json
|
||||
|
||||
def lambda_handler(event, context):
|
||||
try:
|
||||
body = json.loads(event['body'])
|
||||
except:
|
||||
return {
|
||||
"statusCode": 400,
|
||||
"body": json.dumps({'message': 'Unable to parse hasura event'})
|
||||
}
|
||||
|
||||
message = 'Not able to process request'
|
||||
data = body['event']['data']
|
||||
|
||||
if body['table']['name'] == 'notes' and body['event']['op'] == 'INSERT':
|
||||
message = 'New note {} inserted, with data: {}'.format(data['new']['id'], data['new']['note'])
|
||||
|
||||
elif body['table']['name'] == 'notes' and body['event']['op'] == 'UPDATE':
|
||||
message = 'Note {} updated, with data: {}'.format(data['new']['id'], data['new']['note'])
|
||||
|
||||
elif body['table']['name'] == 'notes' and body['event']['op'] == 'DELETE':
|
||||
message = 'Note {} deleted, with data: {}'.format(data['old']['id'], data['old']['note'])
|
||||
return {
|
||||
"statusCode": 200,
|
||||
"body": json.dumps({'message': message})
|
||||
}
|
@ -0,0 +1,36 @@
|
||||
# Setup tables
|
||||
1. Create the following tables using the console:
|
||||
|
||||
```
|
||||
Table name: notes
|
||||
|
||||
Columns:
|
||||
id: Integer auto-increment
|
||||
note: Text
|
||||
|
||||
Table name: note_revision
|
||||
|
||||
Columns:
|
||||
id: Integer auto-increment
|
||||
note: Text
|
||||
note_id: Integer (foreign key to notes.id)
|
||||
update_at: Timestamp, default `now()`
|
||||
```
|
||||
|
||||
# Setup AWS Lambda
|
||||
Create a lambda function in AWS. This will be our webhook.
|
||||
|
||||
1. Create a function.
|
||||
2. Select Python 3.6 as the runtime.
|
||||
3. Select "start from scratch".
|
||||
4. Add API gateway as a trigger.
|
||||
5. Add an API to API gateway.
|
||||
6. Add the code in `mutation.py`. The handler function of your lambda will be `mutation.lambda_handler`.
|
||||
7. Add the following environment variables in your lambda config:
|
||||
1. `ACCESS_KEY`: this is the access key you configured when you set up HGE.
|
||||
2. `HGE_ENDPOINT`: the URL on which your HGE instance is running.
|
||||
|
||||
# Add the trigger in Hasura GraphQL
|
||||
1. In the Events tab, add a trigger
|
||||
2. Select all insert, update, delete operations for the trigger.
|
||||
3. Paste the API endpoint of your AWS lambda as the webhook.
|
@ -0,0 +1,45 @@
|
||||
import os
|
||||
import json
|
||||
from botocore.vendored import requests
|
||||
|
||||
ACCESS_KEY = os.environ['ACCESS_KEY']
|
||||
HGE_ENDPOINT = os.environ['HGE_ENDPOINT']
|
||||
HGE_URL = HGE_ENDPOINT + '/v1alpha1/graphql'
|
||||
|
||||
HEADERS = {
|
||||
'Content-Type': 'application/json',
|
||||
'X-Hasura-Access-Key': ACCESS_KEY,
|
||||
}
|
||||
|
||||
query = """
|
||||
mutation updateNoteRevision ($noteId: Int!, $data: String!) {
|
||||
insert_note_revision (objects: [
|
||||
{
|
||||
note_id: $noteId,
|
||||
note: $data
|
||||
}
|
||||
]) {
|
||||
affected_rows
|
||||
}
|
||||
}
|
||||
"""
|
||||
|
||||
def lambda_handler(event, context):
|
||||
try:
|
||||
body = json.loads(event['body'])
|
||||
except:
|
||||
return {
|
||||
"statusCode": 400,
|
||||
"body": json.dumps({'message': 'Unable to parse request body'})
|
||||
}
|
||||
data = body['event']['data']
|
||||
qv = {'noteId': data['old']['id'], 'data': data['old']['note']}
|
||||
jsonBody = {'query': query, 'variables': qv}
|
||||
|
||||
resp = requests.post(HGE_URL, data=json.dumps(jsonBody), headers=HEADERS)
|
||||
my_json = resp.json()
|
||||
print(my_json)
|
||||
return {
|
||||
"statusCode": 200,
|
||||
"body": json.dumps({'message': 'success'})
|
||||
}
|
@ -0,0 +1,17 @@
|
||||
# Boilerplates for Azure Cloud Function serverless functions and Hasura GraphQL Engine's Event Triggers
|
||||
|
||||
**NOTE**
|
||||
Some of the languages/platforms are work in progress. We welcome contributions for the WIP languages. See issues and the following checklist:
|
||||
|
||||
| Folder name | Use-case| Javascript | Java | C# | F#
|
||||
|-------------|---------|:--------:|:------:|:----:|:---:
|
||||
| echo | echo the trigger payload | ✅ | ❌ | ❌ | ❌
|
||||
| mutation | insert related data on an insert event using graphql mutation | ✅ | ❌ | ❌ | ❌
|
||||
| push-notification | send push notification on database event | ❌ | ❌ | ❌ | ❌
|
||||
| etl | transform the trigger payload and update an algolia index | ❌ | ❌ | ❌ | ❌
|
||||
|
||||
## Pre-requisites
|
||||
1. A running instance of Hasura GraphQL Engine
|
||||
2. An Azure account with billing enabled
|
||||
3. Install [azure-cli](https://github.com/Azure/azure-cli)
|
||||
4. Install [azure-functions-core-tools](https://github.com/Azure/azure-functions-core-tools)
|
25
community/boilerplates/serverless-triggers/azure-functions/nodejs/echo/.gitignore
vendored
Normal file
@ -0,0 +1,25 @@
|
||||
bin
|
||||
obj
|
||||
csx
|
||||
.vs
|
||||
edge
|
||||
Publish
|
||||
|
||||
*.user
|
||||
*.suo
|
||||
*.cscfg
|
||||
*.Cache
|
||||
project.lock.json
|
||||
|
||||
/packages
|
||||
/TestResults
|
||||
|
||||
/tools/NuGet.exe
|
||||
/App_Data
|
||||
/secrets
|
||||
/data
|
||||
.secrets
|
||||
appsettings.json
|
||||
local.settings.json
|
||||
|
||||
node_modules
|
@ -0,0 +1,5 @@
|
||||
{
|
||||
"recommendations": [
|
||||
"ms-azuretools.vscode-azurefunctions"
|
||||
]
|
||||
}
|
@ -0,0 +1,20 @@
|
||||
{
|
||||
"disabled": false,
|
||||
"bindings": [
|
||||
{
|
||||
"authLevel": "anonymous",
|
||||
"type": "httpTrigger",
|
||||
"direction": "in",
|
||||
"name": "req",
|
||||
"methods": [
|
||||
"get",
|
||||
"post"
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "http",
|
||||
"direction": "out",
|
||||
"name": "res"
|
||||
}
|
||||
]
|
||||
}
|
@ -0,0 +1,17 @@
|
||||
module.exports = function (context, req) {
|
||||
context.log('JavaScript HTTP trigger function processed a request.');
|
||||
|
||||
if (req.body) {
|
||||
context.res = {
|
||||
// status: 200, /* Defaults to 200 */
|
||||
body: req.body
|
||||
};
|
||||
}
|
||||
else {
|
||||
context.res = {
|
||||
status: 400,
|
||||
body: "Please pass a request body"
|
||||
};
|
||||
}
|
||||
context.done();
|
||||
};
|
@ -0,0 +1,3 @@
|
||||
{
|
||||
"name": "Azure"
|
||||
}
|
@ -0,0 +1,35 @@
|
||||
# Setup tables
|
||||
1. Create table:
|
||||
|
||||
```
|
||||
notes:
|
||||
id: int
|
||||
note: text
|
||||
```
|
||||
|
||||
# Setup Cloud Function
|
||||
1. Run the following commands to deploy:
|
||||
```bash
|
||||
az group create --name 'my-functions-group' --location southindia
|
||||
|
||||
az storage account create --name 'myfunctionsstorage' --location southindia --resource-group 'my-functions-group' --sku Standard_LRS
|
||||
|
||||
az functionapp create --name 'myfunctionsapp' --storage-account 'myfunctionsstorage' --resource-group 'my-functions-group' --consumption-plan-location southindia
|
||||
|
||||
func azure login
|
||||
func azure subscriptions set 'Free Trial'
|
||||
func azure functionapp publish 'myfunctionsapp'
|
||||
```
|
||||
2. Set Environment variables `ACCESS_KEY` and `HGE_ENDPOINT`
|
||||
3. Add a X-Function-Key header if Authorization level is enabled
|
||||
|
||||
# Running locally
|
||||
`func host start`
|
||||
|
||||
# Check Logs
|
||||
`func azure functionapp logstream 'myfunctionsapp'`
|
||||
|
||||
# Add the trigger in Hasura GraphQL
|
||||
1. In events tab, add a trigger
|
||||
2. Select all insert, update, delete operations for the trigger.
|
||||
3. Paste your function URL as the webhook.
|
@ -0,0 +1,11 @@
|
||||
{
|
||||
"functions": [ "HTTPTrigger" ],
|
||||
"id": "3adb8c2ca78f4171bab74dfc9c600a2f",
|
||||
"functionTimeout": "00:00:30",
|
||||
"http": {
|
||||
"routePrefix": ""
|
||||
},
|
||||
"tracing": {
|
||||
"consoleLevel": "verbose"
|
||||
}
|
||||
}
|
25
community/boilerplates/serverless-triggers/azure-functions/nodejs/mutation/.gitignore
vendored
Normal file
@ -0,0 +1,25 @@
|
||||
bin
|
||||
obj
|
||||
csx
|
||||
.vs
|
||||
edge
|
||||
Publish
|
||||
|
||||
*.user
|
||||
*.suo
|
||||
*.cscfg
|
||||
*.Cache
|
||||
project.lock.json
|
||||
|
||||
/packages
|
||||
/TestResults
|
||||
|
||||
/tools/NuGet.exe
|
||||
/App_Data
|
||||
/secrets
|
||||
/data
|
||||
.secrets
|
||||
appsettings.json
|
||||
local.settings.json
|
||||
|
||||
node_modules
|
@ -0,0 +1,5 @@
|
||||
{
|
||||
"recommendations": [
|
||||
"ms-azuretools.vscode-azurefunctions"
|
||||
]
|
||||
}
|
@@ -0,0 +1,20 @@
{
  "disabled": false,
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}
@@ -0,0 +1,61 @@
const { query } = require('graphqurl');
const ACCESS_KEY = process.env.ACCESS_KEY;
const HGE_ENDPOINT = process.env.HGE_ENDPOINT;

const MUTATION_NOTE_REVISION = `
mutation updateNoteRevision ($noteId: Int!, $data: String!) {
  insert_note_revision (objects: [
    {
      note_id: $noteId,
      note: $data
    }
  ]) {
    affected_rows
  }
}
`;

module.exports = function (context, req) {
  context.log('JavaScript HTTP trigger function processed a request.');
  try {
    const { event: { op, data }, table } = req.body;
    context.log(data);
    // Archive the old row into note_revision (data.new is null on DELETE)
    context.log(data.old.id);
    const qv = { noteId: data.old.id, data: data.old.note };
    query({
      query: MUTATION_NOTE_REVISION,
      endpoint: HGE_ENDPOINT + '/v1alpha1/graphql',
      variables: qv,
      headers: {
        'x-hasura-access-key': ACCESS_KEY
      }
    }).then((response) => {
      context.log(response);
      context.res = {
        body: {
          error: false,
          data: response
        }
      };
      context.done();
    }).catch((error) => {
      console.error(JSON.stringify(error));
      context.res = {
        status: 500,
        body: {
          error: true,
          data: JSON.stringify(error)
        }
      };
      context.done();
    });
  } catch (e) {
    context.res = {
      status: 400,
      body: "An error occurred."
    };
    context.done();
  }
};
@@ -0,0 +1,15 @@
{
  "name": "cloudfunction",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "graphqurl": "^0.3.1",
    "node-fetch": "^2.2.0"
  }
}
@@ -0,0 +1,3 @@
{
  "name": "Azure"
}
@@ -0,0 +1,47 @@
# Setup tables
1. Create tables:

```
Table name: notes

Columns:
id: Integer auto-increment
note: Text

Table name: note_revision

Columns:
id: Integer auto-increment
note: Text
note_id: Integer (foreign key to notes.id)
update_at: Timestamp, default `now()`
```

# Setup Cloud Function
1. Run the following commands to deploy:
```bash
az group create --name 'my-functions-group' --location southindia

az storage account create --name 'myfunctionsstorage' --location southindia --resource-group 'my-functions-group' --sku Standard_LRS

az functionapp create --name 'myfunctionsapp' --storage-account 'myfunctionsstorage' --resource-group 'my-functions-group' --consumption-plan-location southindia

func azure login
func azure subscriptions set 'Free Trial'
func azure functionapp publish 'myfunctionsapp'
```

2. Set environment variables `ACCESS_KEY` and `HGE_ENDPOINT`
3. Add an `X-Function-Key` header if an authorization level is enabled

# Running locally

`func host start`

# Check Logs

`func azure functionapp logstream 'myfunctionsapp'`

# Add the trigger in Hasura GraphQL

1. In the Events tab, add a trigger.
2. Select all insert, update and delete operations for the trigger.
3. Paste your function URL as the webhook.
@@ -0,0 +1,14 @@
{
  "functions": [ "HTTPTrigger" ],
  "id": "3adb8c2ca78f4171bab74dfc9c600a2a",
  "functionTimeout": "00:00:30",
  "extensions": {
    "http": {
      "routePrefix": ""
    }
  },
  "logging": {
    "consoleLevel": "verbose"
  },
  "version": "2.0"
}
@@ -0,0 +1,25 @@
# Boilerplates for Google Cloud Functions and Hasura GraphQL Engine's Event Triggers

Sample cloud functions that can be triggered on changes in the database using GraphQL Engine's Event Triggers.

**NOTE**
Some of the languages/platforms are work in progress. We welcome contributions for the WIP languages. See issues and the following checklist:

| Folder name | Use-case | Node.js (8) | Node.js (6) | Python |
|-------------|----------|:-----------:|:-----------:|:------:|
| echo | echo the trigger payload | ✅ | ❌ | ❌ |
| mutation | insert related data on an insert event using a GraphQL mutation | ✅ | ❌ | ❌ |
| push-notification | send a push notification on a database event | ❌ | ❌ | ❌ |
| etl | transform the trigger payload and update an Algolia index | ❌ | ❌ | ❌ |

## Pre-requisites

1. Google Cloud account with billing enabled
2. `gcloud` CLI
3. Hasura GraphQL Engine

Get the `gcloud beta` component:

```bash
gcloud components update && gcloud components install beta
```
@@ -0,0 +1,20 @@
# This file specifies files that are *not* uploaded to Google Cloud Platform
# using gcloud. It follows the same syntax as .gitignore, with the addition of
# "#!include" directives (which insert the entries of the given .gitignore-style
# file at that point).
#
# For more information, run:
#   $ gcloud topic gcloudignore
#
.gcloudignore
# If you would like to upload your .git directory, .gitignore file or files
# from your .gitignore file, remove the corresponding line
# below:
.git
.gitignore

node_modules

.env.yaml
.prod.env.yaml
2 community/boilerplates/serverless-triggers/google-cloud-functions/nodejs8/echo/.gitignore vendored Normal file
@@ -0,0 +1,2 @@
node_modules
.prod.env.yaml
@@ -0,0 +1,41 @@
# Setup table

Create the table using the console:

```
Table name: profile

Columns:

id: Integer auto-increment
name: Text
address: Text
lat: Numeric, Nullable
lng: Numeric, Nullable
```

# Deploy Google Cloud function

Deploy the function:

```bash
gcloud beta functions deploy nodejs-echo \
  --runtime nodejs8 \
  --trigger-http
```

Get the trigger URL:
```yaml
httpsTrigger:
  url: https://us-central1-hasura-test.cloudfunctions.net/nodejs-echo
```

Open the Hasura console, go to `Events -> Add Trigger` and create a new trigger:
```
Trigger name: profile_change
Schema/Table: public/profile
Operations: Insert, Update, Delete
Webhook URL: [Trigger URL]
```

Once the trigger is created, go to `Data -> profile -> Insert row` and add a row.
@@ -0,0 +1,6 @@
exports.function = (req, res) => {
  const { event: { op, data }, table: { name, schema } } = req.body;
  const response = { message: 'received event', data: { op, data, name, schema } };
  console.log(response);
  res.json(response);
};
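The echo handler above can be exercised locally without deploying, by calling it with mock `req`/`res` objects. This is a minimal sketch; the payload values are illustrative, and only the payload *shape* (Hasura's event trigger format) is assumed.

```javascript
// Same logic as the echo handler, inlined so the sketch is self-contained.
const handler = (req, res) => {
  const { event: { op, data }, table: { name, schema } } = req.body;
  res.json({ message: 'received event', data: { op, data, name, schema } });
};

// Mock request with an illustrative INSERT event on the `profile` table.
const req = {
  body: {
    event: { op: 'INSERT', data: { old: null, new: { id: 1, name: 'test' } } },
    table: { name: 'profile', schema: 'public' }
  }
};

// Mock response that captures whatever the handler sends.
let sent;
const res = { json: (payload) => { sent = payload; } };

handler(req, res);
console.log(sent.message); // received event
```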
@@ -0,0 +1,2 @@
GMAPS_API_KEY: '[GMAPS_API_KEY]'
HASURA_GRAPHQL_ENGINE_URL: 'http://[HGE_IP]/v1alpha1/graphql'
@@ -0,0 +1,19 @@
# This file specifies files that are *not* uploaded to Google Cloud Platform
# using gcloud. It follows the same syntax as .gitignore, with the addition of
# "#!include" directives (which insert the entries of the given .gitignore-style
# file at that point).
#
# For more information, run:
#   $ gcloud topic gcloudignore
#
.gcloudignore
# If you would like to upload your .git directory, .gitignore file or files
# from your .gitignore file, remove the corresponding line
# below:
.git
.gitignore

node_modules

.env.yaml
.prod.env.yaml
@@ -0,0 +1,2 @@
node_modules
.prod.env.yaml
@@ -0,0 +1,71 @@
# Setup table

Create the table using the console:

```
Table name: profile

Columns:

id: Integer auto-increment
name: Text
address: Text
lat: Numeric, Nullable
lng: Numeric, Nullable
```

# Deploy Google Cloud function

We are going to use the Google Maps API in our cloud function.

Enable the Google Maps API and get an API key (we'll call it `GMAPS_API_KEY`) by following [this guide](https://developers.google.com/maps/documentation/geocoding/start?hl=el#auth).

Check the `Places` box to get access to the Geocoding API.

We'll follow [this guide](https://cloud.google.com/functions/docs/quickstart) and create a Cloud Function with Node.js 8.

```bash
gcloud components update &&
gcloud components install beta
```

Go to the `cloudfunction` directory:

```bash
cd cloudfunction
```

Edit `.env.yaml` and add values for the following as shown:
```yaml
# .env.yaml
GMAPS_API_KEY: '[GMAPS_API_KEY]'
HASURA_GRAPHQL_ENGINE_URL: 'http://[HGE_IP]/v1alpha1/graphql'
```

```bash
gcloud beta functions deploy trigger \
  --runtime nodejs8 \
  --trigger-http \
  --region asia-south1 \
  --env-vars-file .env.yaml
```

Get the trigger URL:
```yaml
httpsTrigger:
  url: https://asia-south1-hasura-test.cloudfunctions.net/trigger
```

Go to `HGE_IP` in a browser, then `Events -> Add Trigger` and create a new trigger:
```
Trigger name: profile_change
Schema/Table: public/profile
Operations: Insert
Webhook URL: [Trigger URL]
```

Once the trigger is created, go to `Data -> profile -> Insert row` and add a new profile with a name and address, then save. Go to the `Browse rows` tab to see the lat and lng columns updated by the cloud function.
@@ -0,0 +1,46 @@
const fetch = require('node-fetch');
const { query } = require('graphqurl');

const MUTATION_UPDATE_LAT_LONG = `
mutation updateLatLong($id: Int!, $lat: numeric!, $lng: numeric!) {
  update_profile(
    where: {id: {_eq: $id}},
    _set: {
      lat: $lat,
      lng: $lng
    }
  ) {
    affected_rows
    returning {
      id
      lat
      lng
    }
  }
}`;

exports.trigger = async (req, res) => {
  const GMAPS_API_KEY = process.env.GMAPS_API_KEY;
  const HASURA_GRAPHQL_ENGINE_URL = process.env.HASURA_GRAPHQL_ENGINE_URL;

  const { event: { op, data }, table } = req.body;

  if (op === 'INSERT' && table.name === 'profile' && table.schema === 'public') {
    const { id, address } = data.new;
    const gmaps = await fetch(`https://maps.googleapis.com/maps/api/geocode/json?address=${encodeURIComponent(address)}&key=${GMAPS_API_KEY}`);
    const gmapsResponse = await gmaps.json();
    if (!gmapsResponse.results || gmapsResponse.results.length === 0) {
      res.json({error: true, data: gmapsResponse, message: `no results for address ${address}`});
      return;
    }
    const { lat, lng } = gmapsResponse.results[0].geometry.location;
    const hgeResponse = await query({
      query: MUTATION_UPDATE_LAT_LONG,
      endpoint: HASURA_GRAPHQL_ENGINE_URL,
      variables: { id, lat, lng },
    });
    res.json({error: false, data: hgeResponse.data});
  } else {
    res.json({error: false, message: 'ignored event'});
  }
};
@@ -0,0 +1,15 @@
{
  "name": "cloudfunction",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "graphqurl": "^0.3.1",
    "node-fetch": "^2.2.0"
  }
}
1 community/boilerplates/serverless-triggers/zeit-serverless-docker/.gitignore vendored Normal file
@@ -0,0 +1 @@
node_modules
@@ -0,0 +1,13 @@
# Boilerplates for Zeit Serverless Docker functions and Hasura GraphQL Engine's Event Triggers

**NOTE**
Some of the languages/platforms are work in progress. We welcome contributions for the WIP languages. See issues.

# Pre-requisites

1. Running instance of Hasura GraphQL Engine

# Setup the `now` CLI
1. Create a Zeit account at https://zeit.co/
2. Download the now CLI from https://zeit.co/download#now-cli
3. Login using `now login`
@@ -0,0 +1,4 @@
*
!package.json
!package-lock.json
!index.js
@@ -0,0 +1,5 @@
FROM node:10-alpine
COPY package.json package-lock.json ./
RUN npm ci
COPY index.js .
CMD ["node", "node_modules/.bin/micro"]
@@ -0,0 +1,22 @@
# Setup our tables

Create the table using the console:

```
Table name: notes

Columns:
id: Integer auto-increment
note: Text
```

# Setup the echo serverless function
1. `cd echo`
2. Edit `now.json` to change the app and alias name (note: the alias becomes the domain; for example, setting the alias to hge-events-zeit-node-echo gives hge-events-zeit-node-echo.now.sh)
3. Deploy the function by running `now && now alias && now remove <app-name> --safe -y`

# Add events in Hasura GraphQL

1. In the Events tab, add a trigger.
2. Select all insert, update and delete operations for the trigger.
3. Paste the API endpoint as the webhook.
@@ -0,0 +1,21 @@
const {json, send} = require('micro');

module.exports = async (req, res) => {
  let js;
  try {
    js = await json(req);
  } catch (err) {
    send(res, 400, {'error': err.message});
    return;
  }
  let message = 'Not able to process request';

  if (js.event.op === 'INSERT' && js.table.name === 'notes') {
    message = `New note ${js.event.data.new.id} inserted, with data: ${js.event.data.new.note}`;
  } else if (js.event.op === 'UPDATE' && js.table.name === 'notes') {
    message = `Note ${js.event.data.new.id} updated, with data: ${js.event.data.new.note}`;
  } else if (js.event.op === 'DELETE' && js.table.name === 'notes') {
    message = `Note ${js.event.data.old.id} deleted, with data: ${js.event.data.old.note}`;
  }

  send(res, 200, {'message': message});
};
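The branching above can be sketched as a pure function, so the message formatting is testable without an HTTP server. The payload values below are illustrative, and the DELETE wording is normalized to "Note ... deleted" (the original says "New note" for deletes, which looks like a copy-paste slip).

```javascript
// Pure-function sketch of the echo handler's message formatting.
// `js` is assumed to have Hasura's event trigger shape.
const formatMessage = (js) => {
  let message = 'Not able to process request';
  if (js.event.op === 'INSERT' && js.table.name === 'notes') {
    message = `New note ${js.event.data.new.id} inserted, with data: ${js.event.data.new.note}`;
  } else if (js.event.op === 'UPDATE' && js.table.name === 'notes') {
    message = `Note ${js.event.data.new.id} updated, with data: ${js.event.data.new.note}`;
  } else if (js.event.op === 'DELETE' && js.table.name === 'notes') {
    message = `Note ${js.event.data.old.id} deleted, with data: ${js.event.data.old.note}`;
  }
  return message;
};

// Illustrative INSERT payload:
console.log(formatMessage({
  event: { op: 'INSERT', data: { new: { id: 7, note: 'hello' } } },
  table: { name: 'notes' }
})); // New note 7 inserted, with data: hello
```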
@@ -0,0 +1,9 @@
{
  "public": true,
  "type": "docker",
  "features": {
    "cloud": "v2"
  },
  "alias": "hge-events-zeit-node-echo",
  "name": "hge-events-zeit-node-echo"
}
145 community/boilerplates/serverless-triggers/zeit-serverless-docker/nodejs/echo/package-lock.json generated Normal file
@@ -0,0 +1,145 @@
{
  "name": "hge-events-zeit-node",
  "requires": true,
  "lockfileVersion": 1,
  "dependencies": {
    "ansi-styles": {
      "version": "3.2.1",
      "resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-3.2.1.tgz",
      "integrity": "sha512-VT0ZI6kZRdTh8YyJw3SMbYm/u+NqfsAxEpWO0Pf9sq8/e94WxxOpPKx9FR1FlyCtOVDNOQ+8ntlqFxiRc+r5qA==",
      "requires": {
        "color-convert": "^1.9.0"
      }
    },
    "arg": {
      "version": "2.0.0",
      "resolved": "https://registry.npmjs.org/arg/-/arg-2.0.0.tgz",
      "integrity": "sha512-XxNTUzKnz1ctK3ZIcI2XUPlD96wbHP2nGqkPKpvk/HNRlPveYrXIVSTk9m3LcqOgDPg3B1nMvdV/K8wZd7PG4w=="
    },
    "bytes": {
      "version": "3.0.0",
      "resolved": "https://registry.npmjs.org/bytes/-/bytes-3.0.0.tgz",
      "integrity": "sha1-0ygVQE1olpn4Wk6k+odV3ROpYEg="
    },
    "chalk": {
      "version": "2.4.0",
      "resolved": "https://registry.npmjs.org/chalk/-/chalk-2.4.0.tgz",
      "integrity": "sha512-Wr/w0f4o9LuE7K53cD0qmbAMM+2XNLzR29vFn5hqko4sxGlUsyy363NvmyGIyk5tpe9cjTr9SJYbysEyPkRnFw==",
      "requires": {
        "ansi-styles": "^3.2.1",
        "escape-string-regexp": "^1.0.5",
        "supports-color": "^5.3.0"
      }
    },
    "color-convert": {
      "version": "1.9.3",
      "resolved": "https://registry.npmjs.org/color-convert/-/color-convert-1.9.3.tgz",
      "integrity": "sha512-QfAUtd+vFdAtFQcC8CCyYt1fYWxSqAiK2cSD6zDB8N3cpsEBAvRxp9zOGg6G/SHHJYAT88/az/IuDGALsNVbGg==",
      "requires": {
        "color-name": "1.1.3"
      }
    },
    "color-name": {
      "version": "1.1.3",
      "resolved": "https://registry.npmjs.org/color-name/-/color-name-1.1.3.tgz",
      "integrity": "sha1-p9BVi9icQveV3UIyj3QIMcpTvCU="
    },
    "content-type": {
      "version": "1.0.4",
      "resolved": "https://registry.npmjs.org/content-type/-/content-type-1.0.4.tgz",
      "integrity": "sha512-hIP3EEPs8tB9AT1L+NUqtwOAps4mk2Zob89MWXMHjHWg9milF/j4osnnQLXBCBFBk/tvIG/tUc9mOUJiPBhPXA=="
    },
    "depd": {
      "version": "1.1.1",
      "resolved": "https://registry.npmjs.org/depd/-/depd-1.1.1.tgz",
      "integrity": "sha1-V4O04cRZ8G+lyif5kfPQbnoxA1k="
    },
    "escape-string-regexp": {
      "version": "1.0.5",
      "resolved": "https://registry.npmjs.org/escape-string-regexp/-/escape-string-regexp-1.0.5.tgz",
      "integrity": "sha1-G2HAViGQqN/2rjuyzwIAyhMLhtQ="
    },
    "has-flag": {
      "version": "3.0.0",
      "resolved": "https://registry.npmjs.org/has-flag/-/has-flag-3.0.0.tgz",
      "integrity": "sha1-tdRU3CGZriJWmfNGfloH87lVuv0="
    },
    "http-errors": {
      "version": "1.6.2",
      "resolved": "https://registry.npmjs.org/http-errors/-/http-errors-1.6.2.tgz",
      "integrity": "sha1-CgAsyFcHGSp+eUbO7cERVfYOxzY=",
      "requires": {
        "depd": "1.1.1",
        "inherits": "2.0.3",
        "setprototypeof": "1.0.3",
        "statuses": ">= 1.3.1 < 2"
      }
    },
    "iconv-lite": {
      "version": "0.4.19",
      "resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.4.19.tgz",
      "integrity": "sha512-oTZqweIP51xaGPI4uPa56/Pri/480R+mo7SeU+YETByQNhDG55ycFyNLIgta9vXhILrxXDmF7ZGhqZIcuN0gJQ=="
    },
    "inherits": {
      "version": "2.0.3",
      "resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.3.tgz",
      "integrity": "sha1-Yzwsg+PaQqUC9SRmAiSA9CCCYd4="
    },
    "is-stream": {
      "version": "1.1.0",
      "resolved": "https://registry.npmjs.org/is-stream/-/is-stream-1.1.0.tgz",
      "integrity": "sha1-EtSj3U5o4Lec6428hBc66A2RykQ="
    },
    "micro": {
      "version": "9.3.3",
      "resolved": "https://registry.npmjs.org/micro/-/micro-9.3.3.tgz",
      "integrity": "sha512-GbCp4NFQguARch0odX+BuWDja2Kc1pbYZqWfRvEDihGFTJG8U77C0L+Owg2j7TPyhQ5Tc+7z/SxspRqjdiZCjQ==",
      "requires": {
        "arg": "2.0.0",
        "chalk": "2.4.0",
        "content-type": "1.0.4",
        "is-stream": "1.1.0",
        "raw-body": "2.3.2"
      }
    },
    "node-fetch": {
      "version": "2.2.0",
      "resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-2.2.0.tgz",
      "integrity": "sha512-OayFWziIxiHY8bCUyLX6sTpDH8Jsbp4FfYd1j1f7vZyfgkcOnAyM4oQR16f8a0s7Gl/viMGRey8eScYk4V4EZA=="
    },
    "raw-body": {
      "version": "2.3.2",
      "resolved": "https://registry.npmjs.org/raw-body/-/raw-body-2.3.2.tgz",
      "integrity": "sha1-vNYMd9Prk83gBQKVw/N5OJvIj4k=",
      "requires": {
        "bytes": "3.0.0",
        "http-errors": "1.6.2",
        "iconv-lite": "0.4.19",
        "unpipe": "1.0.0"
      }
    },
    "setprototypeof": {
      "version": "1.0.3",
      "resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.0.3.tgz",
      "integrity": "sha1-ZlZ+NwQ+608E2RvWWMDL77VbjgQ="
    },
    "statuses": {
      "version": "1.5.0",
      "resolved": "https://registry.npmjs.org/statuses/-/statuses-1.5.0.tgz",
      "integrity": "sha1-Fhx9rBd2Wf2YEfQ3cfqZOBR4Yow="
    },
    "supports-color": {
      "version": "5.5.0",
      "resolved": "https://registry.npmjs.org/supports-color/-/supports-color-5.5.0.tgz",
      "integrity": "sha512-QjVjwdXIt408MIiAqCX4oUKsgU2EqAGzs2Ppkm4aQYbjm+ZEWEcW4SfFNTr4uMNZma0ey4f5lgLrkB0aX0QMow==",
      "requires": {
        "has-flag": "^3.0.0"
      }
    },
    "unpipe": {
      "version": "1.0.0",
      "resolved": "https://registry.npmjs.org/unpipe/-/unpipe-1.0.0.tgz",
      "integrity": "sha1-sr9O6FFKrmFltIF4KdIbLvSZBOw="
    }
  }
}
@@ -0,0 +1,8 @@
{
  "name": "hge-events-zeit-node-echo",
  "dependencies": {
    "micro": "9.3.3"
  },
  "license": "MIT",
  "main": "./index.js"
}
@@ -0,0 +1,4 @@
*
!package.json
!package-lock.json
!index.js
@@ -0,0 +1,5 @@
FROM node:10-alpine
COPY package.json package-lock.json ./
RUN npm ci
COPY index.js .
CMD ["node", "node_modules/.bin/micro"]
@@ -0,0 +1,30 @@
# Setup our tables

Create the tables using the console:

```
Table name: notes

Columns:
id: Integer auto-increment
note: Text

Table name: note_revision

Columns:
id: Integer auto-increment
note: Text
note_id: Integer (foreign key to notes.id)
update_at: Timestamp, default `now()`
```

# Setup the mutation serverless function
1. `cd mutation`
2. Edit `now.json` to change the app and alias name (note: the alias becomes the domain; for example, setting the alias to hge-events-zeit-node-mutation gives hge-events-zeit-node-mutation.now.sh)
3. Deploy the function by running `now && now alias && now remove <app-name> --safe -y`

# Add events in Hasura GraphQL

1. In the Events tab, add a trigger.
2. Select all insert, update and delete operations for the trigger.
3. Paste the API endpoint as the webhook.
@@ -0,0 +1,43 @@
const {json, send} = require('micro');
const { query } = require('graphqurl');

const accessKey = process.env.ACCESS_KEY;
const hgeEndpoint = process.env.HGE_ENDPOINT + '/v1alpha1/graphql';

const q = `
mutation updateNoteRevision ($noteId: Int!, $data: String!) {
  insert_note_revisions (objects: [
    {
      note_id: $noteId,
      updated_note: $data
    }
  ]) {
    affected_rows
  }
}
`;

module.exports = async (req, res) => {
  let js;
  try {
    js = await json(req);
  } catch (err) {
    send(res, 400, {'error': err.message});
    return;
  }
  query(
    {
      query: q,
      endpoint: hgeEndpoint,
      variables: {noteId: js.event.data.old.id, data: js.event.data.old.note},
      headers: {
        'x-hasura-access-key': accessKey,
        'Content-Type': 'application/json'
      }
    }
  ).then((response) => {
    console.log(response);
    send(res, 200, {'message': response});
  }).catch((error) => {
    send(res, 400, {'message': error});
  });
};
@@ -0,0 +1,9 @@
{
  "public": true,
  "type": "docker",
  "features": {
    "cloud": "v2"
  },
  "alias": "hge-events-zeit-node-mutation",
  "name": "hge-events-zeit-node-mutation"
}
1369 community/boilerplates/serverless-triggers/zeit-serverless-docker/nodejs/mutation/package-lock.json generated Normal file
File diff suppressed because it is too large
@@ -0,0 +1,9 @@
{
  "name": "hge-events-zeit-node-echo",
  "dependencies": {
    "graphqurl": "^0.3.1",
    "micro": "9.3.3"
  },
  "license": "MIT",
  "main": "./index.js"
}
21 community/examples/realtime-chat/.gitignore vendored Normal file
@@ -0,0 +1,21 @@
# See https://help.github.com/ignore-files/ for more about ignoring files.

# dependencies
/node_modules

# testing
/coverage

# production
/build

# misc
.DS_Store
.env.local
.env.development.local
.env.test.local
.env.production.local

npm-debug.log*
yarn-debug.log*
yarn-error.log*
9 community/examples/realtime-chat/README.md Normal file
@@ -0,0 +1,9 @@
# Realtime Chat using GraphQL Subscriptions

This is the source code for a fully working group chat app that uses subscriptions in Hasura GraphQL Engine. It is built using React and Apollo.

- [Fully working app](https://chat-example-trial-roar.herokuapp.com)
- [Backend](https://hasura-realtime-group-chat.herokuapp.com)

For a complete tutorial on data modelling, [check out this blog](https://medium.com/@rishichandrawawhal/building-a-realtime-chat-app-with-graphql-subscriptions-d68cd33e73f).
33 community/examples/realtime-chat/package.json Normal file
@@ -0,0 +1,33 @@
{
  "name": "react-graphql-boilerplate",
  "version": "0.1.0",
  "private": true,
  "dependencies": {
    "apollo-boost": "^0.1.10",
    "apollo-client-preset": "^1.0.8",
    "apollo-link-schema": "^1.1.0",
    "apollo-link-ws": "^1.0.8",
    "apollo-utilities": "^1.0.16",
    "graphql": "^0.13.2",
    "graphql-tag": "^2.9.2",
    "graphql-tools": "^3.0.5",
    "moment": "^2.22.2",
    "react": "^16.4.1",
    "react-apollo": "^2.1.9",
    "react-dom": "^16.4.1",
    "react-router-dom": "^4.3.1",
    "react-scripts": "^1.1.1",
    "subscriptions-transport-ws": "^0.9.12"
  },
  "scripts": {
    "start": "react-scripts start",
    "build": "react-scripts build",
    "test": "react-scripts test --env=jsdom",
    "eject": "react-scripts eject"
  },
  "devDependencies": {
    "eslint-plugin-graphql": "^1.5.0",
    "eslint": "^5.1.0",
    "eslint-plugin-react": "^7.10.0"
  }
}
BIN community/examples/realtime-chat/public/favicon.ico Normal file
Binary file not shown. After Width: | Height: | Size: 3.8 KiB
40 community/examples/realtime-chat/public/index.html Normal file
@@ -0,0 +1,40 @@
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
    <meta name="theme-color" content="#000000">
    <!--
      manifest.json provides metadata used when your web app is added to the
      homescreen on Android. See https://developers.google.com/web/fundamentals/engage-and-retain/web-app-manifest/
    -->
    <link rel="manifest" href="%PUBLIC_URL%/manifest.json">
    <link rel="shortcut icon" href="%PUBLIC_URL%/favicon.ico">
    <!--
      Notice the use of %PUBLIC_URL% in the tags above.
      It will be replaced with the URL of the `public` folder during the build.
      Only files inside the `public` folder can be referenced from the HTML.

      Unlike "/favicon.ico" or "favicon.ico", "%PUBLIC_URL%/favicon.ico" will
      work correctly both with client-side routing and a non-root public URL.
      Learn how to configure a non-root public URL by running `npm run build`.
    -->
    <title>React App</title>
  </head>
  <body>
    <noscript>
      You need to enable JavaScript to run this app.
    </noscript>
    <div id="root"></div>
    <!--
      This HTML file is a template.
      If you open it directly in the browser, you will see an empty page.

      You can add webfonts, meta tags, or analytics to this file.
      The build step will place the bundled scripts into the <body> tag.

      To begin the development, run `npm start` or `yarn start`.
      To create a production bundle, use `npm run build` or `yarn build`.
    -->
  </body>
</html>
15 community/examples/realtime-chat/public/manifest.json Normal file
@@ -0,0 +1,15 @@
{
  "short_name": "React App",
  "name": "Create React App Sample",
  "icons": [
    {
      "src": "favicon.ico",
      "sizes": "64x64 32x32 24x24 16x16",
      "type": "image/x-icon"
    }
  ],
  "start_url": "./index.html",
  "display": "standalone",
  "theme_color": "#000000",
  "background_color": "#ffffff"
}
265 community/examples/realtime-chat/src/App.css Normal file
@@ -0,0 +1,265 @@
@import url('https://fonts.googleapis.com/css?family=Raleway:400,600');
@import url('https://fonts.googleapis.com/css?family=Open+Sans:300,400');
body {
  font-family: 'Open Sans';
  font-size: 16px;
  margin: 0;
}
.app {
  text-align: left;
}

.messageOdd {
  font-size: 18px;
  padding-left: 5px;
}

.message {
  font-size: 16px;
  background-color: #fff;
  margin: 20px;
  border-radius: 5px;
  width: 50%;
  padding: 5px;
  padding-left: 5px;
}

.selfMessage {
  font-size: 16px;
  background-color: #eee;
  margin: 20px;
  border-radius: 5px;
  width: 50%;
  padding: 5px;
  padding-left: 5px;
  float: right;
}

.newMessageEven {
  font-size: 18px;
  background-color: #98FB98;
  padding-left: 5px;
}

.newMessageOdd {
  font-size: 18px;
  background-color: #8FBC8F;
  padding-left: 5px;
}

.messageWrapperNew {
  padding-bottom: 75px;
}

.banner {
  position: -webkit-sticky;
  position: sticky;
  top: 0;
  align-self: flex-start;
  background-color: #20c40f;
  font-size: 18px;
  cursor: pointer;
  font-weight: 400;
  padding: 10px 0;
  text-align: center;
  font-family: 'raleway';
  color: #fff;
}

.oldNewSeparator {
  font-size: 18px;
  text-align: center;
  margin-bottom: 15px;
}

#chatbox {
  overflow: auto;
  height: 100vh;
  background-color: #f8f9f9;
}

.textboxWrapper {
  text-align: center;
  position: fixed;
  bottom: 0;
  width: 75%;
  background-color: #fff;
  padding: 1%;
}

.textbox {
}
.sendButton {
  width: 20%;
  background-color: green;
}

.login {
  text-align: center;
}

.loginTextbox {
  font-size: 16px;
  height: 43px;
  width: 74%;
  margin-right: 1%;
  font-weight: 300;
  border: 1px solid #ececec;
  border-radius: 5px;
  padding: 0;
  padding-left: 10px;
  display: inline-block;
}
.typoTextbox {
  font-size: 16px;
  height: 43px;
  width: 75%;
  margin-right: 1%;
  font-weight: 300;
  border: 1px solid #ececec;
  border-radius: 5px;
  padding: 0;
  padding-left: 10px;
  display: inline-block;
  background-color: #f6f6f7;
}
.loginTextbox:focus, .typoTextbox:focus {
  outline: none;
  border-color: #016d95;
}
.typoTextbox:focus {
  outline: none;
  border-color: #bbbdbd;
}

.loginButton {
  height: 45px;
  width: 21%;
  display: inline-block;
  border-radius: 5px;
  background-color: green !important;
  cursor: pointer;
  font-size: 16px;
  margin-right: 1%;
}
.typoButton {
  height: 45px;
  width: 20%;
  display: inline-block;
  border-radius: 5px;
  background-color: #ffca27;
  cursor: pointer;
  font-size: 16px;
  margin-right: 1%;
  border: 0;
  color: #222;
}
.typoButton:hover {
  background-color: #dba203;
}
.loginButton:focus, .typoButton:focus {
  outline: none;
}
.loginButton:hover {
  background-color: #f6f6f6;
}
.loginHeading {
  text-align: center;
  font-family: 'Raleway';
  margin-top: 0;
  padding-bottom: 20px;
}
.loginWrapper {
  width: 450px;
  padding: 30px;
  margin: 0 auto;
  position: fixed;
  left: 50%;
  top: 50%;
  transform: translate(-50%, -50%);
  border: 1px solid #ececec;
  border-radius: 5px;
  background-color: #f6f6f6;
|
||||
}
|
||||
|
||||
.errorMessage {
|
||||
text-align: center;
|
||||
color: 'red';
|
||||
padding-bottom: 10px;
|
||||
}
|
||||
|
||||
.wd25 {
|
||||
width: 25%;
|
||||
display: inline-block;
|
||||
}
|
||||
|
||||
.wd75 {
|
||||
width: 75%;
|
||||
display: inline-block;
|
||||
}
|
||||
|
||||
.onlineUsers {
|
||||
background-color: #4f5050;
|
||||
height: 100vh;
|
||||
overflow: auto;
|
||||
}
|
||||
.messageName, .messsageTime
|
||||
{
|
||||
width: 49%;
|
||||
display: inline-block;
|
||||
}
|
||||
.messageName
|
||||
{
|
||||
color: #1d5d01;
|
||||
}
|
||||
.messsageTime
|
||||
{
|
||||
text-align: right;
|
||||
padding-right: 5px;
|
||||
font-size: 12px;
|
||||
color: #01999b;
|
||||
}
|
||||
.userList {
|
||||
margin-top: 0;
|
||||
margin-bottom: 0;
|
||||
-webkit-padding-start: 0px;
|
||||
}
|
||||
|
||||
.userList li {
|
||||
padding: 10px;
|
||||
border-bottom: 1px solid #444;
|
||||
color: #fff;
|
||||
}
|
||||
|
||||
.chatWrapper {
|
||||
display: flex;
|
||||
height: 100vh;
|
||||
}
|
||||
|
||||
.userListHeading {
|
||||
font-weight: 600;
|
||||
padding: 15px 10px;
|
||||
margin-top: 0;
|
||||
margin-bottom: 0;
|
||||
background-color: #222;
|
||||
color: #fff;
|
||||
}
|
||||
|
||||
.typingIndicator {
|
||||
text-align: left;
|
||||
padding-bottom: 10px;
|
||||
padding-left: 1%;
|
||||
}
|
9 community/examples/realtime-chat/src/App.js Normal file
@ -0,0 +1,9 @@
import React from 'react';
import Main from './components/Main';
import './App.css';

const App = () => {
  return <div className="app"> <Main /> </div>;
};

export default App;
9 community/examples/realtime-chat/src/App.test.js Normal file
@ -0,0 +1,9 @@
import React from 'react';
import ReactDOM from 'react-dom';
import App from './App';

it('renders without crashing', () => {
  const div = document.createElement('div');
  ReactDOM.render(<App />, div);
  ReactDOM.unmountComponentAtNode(div);
});
Some files were not shown because too many files have changed in this diff