Datasets implementation for data connectors

PR-URL: https://github.com/hasura/graphql-engine-mono/pull/7502
GitOrigin-RevId: 7ca9a16aa2b27f4efb1263c6415e5a718ca8ced8
Lyndon Maydwell 2023-01-17 15:47:40 +10:00 committed by hasura-bot
parent bfdeaf0334
commit 8d6b9f70f1
38 changed files with 714 additions and 34 deletions

View File

@ -258,6 +258,20 @@ The preference would be to support the highest level of atomicity possible (ie `
The agent can also specify whether or not it supports `returning` data from mutations. This refers to the ability to return the data that was mutated by mutation operations (for example, the updated rows in an update, or the deleted rows in a delete).
### Dataset Capabilities
The agent can declare whether it supports datasets (i.e. an API for creating/cloning schemas). If it supports datasets, it must declare a `datasets` capability:
```json
{
"capabilities": {
"datasets": { }
}
}
```
See [Datasets](#datasets) for information on how this capability is used.
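For illustration, a TypeScript agent built on `@hasura/dc-api-types` might opt in like this (a minimal sketch; the other capability fields shown are optional and purely illustrative):
```typescript
import { Capabilities } from "@hasura/dc-api-types";

// Minimal sketch of a capabilities object that advertises dataset support.
// An empty object is all that is required for the `datasets` capability.
const capabilities: Capabilities = {
  explain: {},
  raw: {},
  datasets: {}, // enables the /datasets endpoints described below
};
```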
### Schema
The `GET /schema` endpoint is called whenever the metadata is (re)loaded by `graphql-engine`. It returns the following JSON object:
@ -2160,3 +2174,42 @@ Breaking down the properties in the `delete`-typed mutation operation:
* `returning_fields`: This specifies a list of fields to return in the response. The property takes the same format as the `fields` property on Queries. It is expected that the specified fields will be returned for all rows affected by the deletion (ie. all deleted rows).
Delete operations return responses that are the same as insert and update operations, except the affected rows in `returning` are the deleted rows instead.
### Datasets
The `/datasets` resources are used to create new databases/schemas from templates.
Datasets are referenced by abstract names: templates, which can be cloned from, and clones, which can be used via config and later deleted. This feature is required for testing the mutations feature, but it may also have non-test uses, for example spinning up interactive demo projects.
The dataset resources support the following methods:
* `GET /datasets/templates/:template_name` -> `{"exists": true|false}`
* `POST /datasets/clones/:clone_name {"from": template_name}` -> `{"config": {...}}`
* `DELETE /datasets/clones/:clone_name` -> `{"message": "success"}`
The `POST` method is the main way to interact with the API: it clones a dataset template under a new name. The new name can later be used to delete the clone, and the config returned from the `POST` call can be used as the config header for non-dataset interactions such as queries.
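For example, a test harness might drive the full clone lifecycle along these lines (a hedged sketch using `fetch`; the agent URL and the template/clone names are placeholders):
```typescript
const agent = "http://localhost:8100"; // placeholder agent URL

async function cloneLifecycle(): Promise<void> {
  // 1. Check that the template exists: -> { exists: true | false }
  const { exists } = await (await fetch(`${agent}/datasets/templates/ChinookData`)).json();
  if (!exists) throw new Error("template missing");

  // 2. Clone the template under a new name: -> { config: { ... } }
  const { config } = await (await fetch(`${agent}/datasets/clones/my_clone`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ from: "ChinookData" }),
  })).json();

  // 3. Use the returned config as the config header for ordinary requests such as /query.
  await fetch(`${agent}/query`, {
    method: "POST",
    headers: {
      "content-type": "application/json",
      "X-Hasura-DataConnector-Config": JSON.stringify(config),
    },
    body: JSON.stringify({ /* query request elided */ }),
  });

  // 4. Delete the clone when finished: -> { message: "success" }
  await fetch(`${agent}/datasets/clones/my_clone`, { method: "DELETE" });
}
```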
The following diagram shows the interactions between the various datatypes and resource methods:
```mermaid
flowchart TD;
NAME["Dataset Name"] --> GET["GET /datasets/templates/:template_name"];
style NAME stroke:#0f3,stroke-width:2px
NAME -- clone_name --> POST;
NAME -- from --> POST["POST /datasets/clones/:clone_name { from: TEMPLATE_NAME }"];
GET --> EXISTS["{ exists: true }"];
GET --> EXISTSF["{ exists: false }"];
GET --> FAILUREG["400"];
style FAILUREG stroke:#f33,stroke-width:2px
POST --> FAILUREP["400"];
style FAILUREP stroke:#f33,stroke-width:2px
NAME --> DELETE["DELETE /datasets/clones/:clone_name"];
POST --> CONFIG["Source Config"];
style CONFIG stroke:#0f3,stroke-width:2px
DELETE --> SUCCESSD["{ message: 'success' }"];
DELETE --> FAILURED["400"];
style FAILURED stroke:#f33,stroke-width:2px
CONFIG --> SCHEMA["POST /schema"];
CONFIG --> QUERY["POST /query"];
CONFIG --> MUTATION["POST /mutation"];
```

View File

@ -1,6 +1,6 @@
{
"name": "@hasura/dc-api-types",
"version": "0.22.0",
"description": "Hasura GraphQL Engine Data Connector Agent API types",
"author": "Hasura (https://github.com/hasura/graphql-engine)",
"license": "Apache-2.0",

View File

@ -370,6 +370,103 @@
}
}
}
},
"/datasets/templates/{template_name}": {
"get": {
"parameters": [
{
"in": "path",
"name": "template_name",
"required": true,
"schema": {
"type": "string"
}
}
],
"responses": {
"200": {
"content": {
"application/json;charset=utf-8": {
"schema": {
"$ref": "#/components/schemas/DatasetGetResponse"
}
}
},
"description": ""
},
"404": {
"description": "`template_name` not found"
}
}
}
},
"/datasets/clones/{clone_name}": {
"post": {
"parameters": [
{
"in": "path",
"name": "clone_name",
"required": true,
"schema": {
"type": "string"
}
}
],
"requestBody": {
"content": {
"application/json;charset=utf-8": {
"schema": {
"$ref": "#/components/schemas/DatasetPostRequest"
}
}
}
},
"responses": {
"200": {
"content": {
"application/json;charset=utf-8": {
"schema": {
"$ref": "#/components/schemas/DatasetPostResponse"
}
}
},
"description": ""
},
"400": {
"description": "Invalid `body`"
},
"404": {
"description": "`clone_name` not found"
}
}
},
"delete": {
"parameters": [
{
"in": "path",
"name": "clone_name",
"required": true,
"schema": {
"type": "string"
}
}
],
"responses": {
"200": {
"content": {
"application/json;charset=utf-8": {
"schema": {
"$ref": "#/components/schemas/DatasetDeleteResponse"
}
}
},
"description": ""
},
"404": {
"description": "`clone_name` not found"
}
}
}
}
},
"components": {
@ -404,6 +501,9 @@
"data_schema": {
"$ref": "#/components/schemas/DataSchemaCapabilities"
},
"datasets": {
"$ref": "#/components/schemas/DatasetCapabilities"
},
"explain": {
"$ref": "#/components/schemas/ExplainCapabilities"
},
@ -573,6 +673,7 @@
"MetricsCapabilities": {},
"ExplainCapabilities": {},
"RawCapabilities": {},
"DatasetCapabilities": {},
"ConfigSchemaResponse": {
"nullable": false,
"properties": {
@ -2350,6 +2451,61 @@
"query"
],
"type": "object"
},
"DatasetGetResponse": {
"properties": {
"exists": {
"description": "Message detailing if the dataset exists",
"type": "boolean"
}
},
"required": [
"exists"
],
"type": "object"
},
"DatasetPostResponse": {
"properties": {
"config": {
"$ref": "#/components/schemas/Config"
}
},
"required": [
"config"
],
"type": "object"
},
"Config": {
"additionalProperties": {
"additionalProperties": true
},
"type": "object"
},
"DatasetPostRequest": {
"properties": {
"from": {
"$ref": "#/components/schemas/DatasetTemplateName"
}
},
"required": [
"from"
],
"type": "object"
},
"DatasetTemplateName": {
"type": "string"
},
"DatasetDeleteResponse": {
"properties": {
"message": {
"description": "Message detailing the result of the delete operation",
"type": "string"
}
},
"required": [
"message"
],
"type": "object"
}
}
}

View File

@ -27,9 +27,16 @@ export type { ComparisonCapabilities } from './models/ComparisonCapabilities';
export type { ComparisonColumn } from './models/ComparisonColumn';
export type { ComparisonOperators } from './models/ComparisonOperators';
export type { ComparisonValue } from './models/ComparisonValue';
export type { Config } from './models/Config';
export type { ConfigSchemaResponse } from './models/ConfigSchemaResponse';
export type { Constraint } from './models/Constraint';
export type { DataSchemaCapabilities } from './models/DataSchemaCapabilities';
export type { DatasetCapabilities } from './models/DatasetCapabilities';
export type { DatasetDeleteResponse } from './models/DatasetDeleteResponse';
export type { DatasetGetResponse } from './models/DatasetGetResponse';
export type { DatasetPostRequest } from './models/DatasetPostRequest';
export type { DatasetPostResponse } from './models/DatasetPostResponse';
export type { DatasetTemplateName } from './models/DatasetTemplateName';
export type { DeleteCapabilities } from './models/DeleteCapabilities';
export type { DeleteMutationOperation } from './models/DeleteMutationOperation';
export type { ErrorResponse } from './models/ErrorResponse';

View File

@ -4,6 +4,7 @@
import type { ComparisonCapabilities } from './ComparisonCapabilities';
import type { DataSchemaCapabilities } from './DataSchemaCapabilities';
import type { DatasetCapabilities } from './DatasetCapabilities';
import type { ExplainCapabilities } from './ExplainCapabilities';
import type { MetricsCapabilities } from './MetricsCapabilities';
import type { MutationCapabilities } from './MutationCapabilities';
@ -16,6 +17,7 @@ import type { SubscriptionCapabilities } from './SubscriptionCapabilities';
export type Capabilities = {
comparisons?: ComparisonCapabilities;
data_schema?: DataSchemaCapabilities;
datasets?: DatasetCapabilities;
explain?: ExplainCapabilities;
metrics?: MetricsCapabilities;
mutations?: MutationCapabilities;

View File

@ -0,0 +1,5 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
export type Config = Record<string, any>;

View File

@ -0,0 +1,7 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
export type DatasetCapabilities = {
};

View File

@ -0,0 +1,11 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
export type DatasetDeleteResponse = {
/**
* Message detailing the result of the delete operation
*/
message: string;
};

View File

@ -0,0 +1,11 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
export type DatasetGetResponse = {
/**
* Message detailing if the dataset exists
*/
exists: boolean;
};

View File

@ -0,0 +1,10 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { DatasetTemplateName } from './DatasetTemplateName';
export type DatasetPostRequest = {
from: DatasetTemplateName;
};

View File

@ -0,0 +1,10 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
import type { Config } from './Config';
export type DatasetPostResponse = {
config: Config;
};

View File

@ -0,0 +1,5 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
export type DatasetTemplateName = string;

View File

@ -24,7 +24,7 @@
},
"dc-api-types": {
"name": "@hasura/dc-api-types",
"version": "0.22.0",
"license": "Apache-2.0",
"devDependencies": {
"@tsconfig/node16": "^1.0.3",
@ -1197,7 +1197,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^7.0.0",
"@hasura/dc-api-types": "0.22.0",
"fastify": "^3.29.0",
"mathjs": "^11.0.0",
"pino-pretty": "^8.0.0",
@ -1781,7 +1781,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.22.0",
"fastify": "^4.4.0",
"fastify-metrics": "^9.2.1",
"nanoid": "^3.3.4",
@ -3125,7 +3125,7 @@
"version": "file:reference",
"requires": {
"@fastify/cors": "^7.0.0",
"@hasura/dc-api-types": "0.22.0",
"@tsconfig/node16": "^1.0.3",
"@types/node": "^16.11.49",
"@types/xml2js": "^0.4.11",
@ -3514,7 +3514,7 @@
"version": "file:sqlite",
"requires": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.22.0",
"@tsconfig/node16": "^1.0.3",
"@types/node": "^16.11.49",
"@types/sqlite3": "^3.1.8",

View File

@ -24,6 +24,8 @@ More specifically, the `ChinookData.xml.gz` file is a GZipped version of https:/
The `schema-tables.json` is manually derived from the schema of the data, as can be seen from the `CREATE TABLE` etc. DDL statements in the various per-database-vendor SQL scripts that can be found in `/ChinookDatabase/DataSources` in that repo.
The datasets can be operated on via the `/datasets` resources as described in `dc-agents/README.md`.
## Configuration
The reference agent supports some configuration properties that can be set via the `value` property of `configuration` on a source in Hasura metadata. The configuration is passed to the agent on each request via the `X-Hasura-DataConnector-Config` header.

View File

@ -10,7 +10,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^7.0.0",
"@hasura/dc-api-types": "0.22.0",
"fastify": "^3.29.0",
"mathjs": "^11.0.0",
"pino-pretty": "^8.0.0",
@ -44,7 +44,7 @@
}
},
"node_modules/@hasura/dc-api-types": {
"version": "0.22.0",
"license": "Apache-2.0",
"devDependencies": {
"@tsconfig/node16": "^1.0.3",

View File

@ -22,7 +22,7 @@
},
"dependencies": {
"@fastify/cors": "^7.0.0",
"@hasura/dc-api-types": "0.22.0",
"fastify": "^3.29.0",
"mathjs": "^11.0.0",
"pino-pretty": "^8.0.0",

View File

@ -57,7 +57,8 @@ const capabilities: Capabilities = {
supports_relations: true
}
},
scalar_types: scalarTypes,
datasets: {}
}
export const capabilitiesResponse: CapabilitiesResponse = {

View File

@ -6,6 +6,7 @@ export type Casing = "pascal_case" | "lowercase";
export type Config = {
tables: string[] | null
schema: string | null
db: string | null
table_name_casing: Casing
column_name_casing: Casing
}
@ -17,6 +18,7 @@ export const getConfig = (request: FastifyRequest): Config => {
return {
tables: config.tables ?? null,
schema: config.schema ?? null,
db: config.db ?? null,
table_name_casing: config.table_name_casing ?? "pascal_case",
column_name_casing: config.column_name_casing ?? "pascal_case",
}
@ -38,6 +40,11 @@ export const configSchema: ConfigSchemaResponse = {
type: "string",
nullable: true
},
db: {
description: "Name of the db. Omit to use the default db.",
type: "string",
nullable: true
},
table_name_casing: {
$ref: "#/other_schemas/Casing"
},

View File

@ -31,8 +31,8 @@ const parseNumbersInNumericColumns = (schema: SchemaResponse) => {
};
}
export const loadStaticData = async (name: string): Promise<StaticData> => {
const gzipReadStream = fs.createReadStream(__dirname + "/" + name);
const unzipStream = stream.pipeline(gzipReadStream, zlib.createGunzip(), () => { });
const xmlStr = (await streamToBuffer(unzipStream)).toString("utf16le");
const xml = await xml2js.parseStringPromise(xmlStr, { explicitArray: false, emptyTag: () => null, valueProcessors: [parseNumbersInNumericColumns(schema)] });

View File

@ -0,0 +1,34 @@
import { DatasetDeleteResponse, DatasetGetResponse, DatasetPostRequest, DatasetPostResponse, } from '@hasura/dc-api-types';
import { loadStaticData, StaticData } from './data';
export async function getDataset(name: string): Promise<DatasetGetResponse> {
const safePath = mkPath(name);
const data = await loadStaticData(safePath); // TODO: Could make this more efficient, but this works for now!
if(data) {
return { exists: true };
} else {
return { exists: false };
}
}
export async function cloneDataset(store: Record<string, StaticData>, name: string, body: DatasetPostRequest): Promise<DatasetPostResponse> {
const safePathName = mkPath(body.from);
const data = await loadStaticData(safePathName);
store[`$${name}`] = data;
return { config: { db: `$${name}` } };
}
export async function deleteDataset(store: Record<string, StaticData>, name: string): Promise<DatasetDeleteResponse> {
const exists = store[`$${name}`];
if(exists) {
delete store[`$${name}`];
return {message: "success"};
} else {
throw(Error("Dataset does not exist."));
}
}
function mkPath(name: string): string {
const base = name.replace(/\//g,''); // TODO: Can this be made safer?
return `${base}.xml.gz`;
}

View File

@ -1,14 +1,15 @@
import Fastify from 'fastify';
import FastifyCors from '@fastify/cors';
import { filterAvailableTables, getSchema, getTable, loadStaticData, StaticData } from './data';
import { queryData } from './query';
import { getConfig } from './config';
import { capabilitiesResponse } from './capabilities';
import { CapabilitiesResponse, SchemaResponse, QueryRequest, QueryResponse, DatasetDeleteResponse, DatasetPostRequest, DatasetGetResponse, DatasetPostResponse } from '@hasura/dc-api-types';
import { cloneDataset, deleteDataset, getDataset } from './datasets';
const port = Number(process.env.PORT) || 8100;
const server = Fastify({ logger: { prettyPrint: true } });
let staticData : Record<string, StaticData> = {};
server.register(FastifyCors, {
// Accept all origins of requests. This must be modified in
@ -33,10 +34,40 @@ server.get<{ Reply: SchemaResponse }>("/schema", async (request, _response) => {
server.post<{ Body: QueryRequest, Reply: QueryResponse }>("/query", async (request, _response) => {
server.log.info({ headers: request.headers, query: request.body, }, "query.request");
const config = getConfig(request);
// Prefix '$' to disambiguate from default datasets.
const dbName = config.db ? `$${config.db}` : '@default';
const data = filterAvailableTables(staticData[dbName], config);
return queryData(getTable(data, config), request.body);
});
// Methods on dataset resources.
//
// Examples:
//
// > curl -H 'content-type: application/json' -XGET localhost:8100/datasets/templates/ChinookData
// {"exists": true}
//
server.get<{ Params: { name: string, }, Reply: DatasetGetResponse }>("/datasets/templates/:name", async (request, _response) => {
server.log.info({ headers: request.headers, query: request.body, }, "datasets.templates.get");
return getDataset(request.params.name);
});
// > curl -H 'content-type: application/json' -XPOST localhost:8100/datasets/clones/foo -d '{"from": "ChinookData"}'
// {"config":{"db":"$foo"}}
//
server.post<{ Params: { name: string, }, Body: DatasetPostRequest, Reply: DatasetPostResponse }>("/datasets/clones/:name", async (request, _response) => {
server.log.info({ headers: request.headers, query: request.body, }, "datasets.clones.post");
return cloneDataset(staticData, request.params.name, request.body);
});
// > curl -H 'content-type: application/json' -XDELETE 'localhost:8100/datasets/clones/foo'
// {"message":"success"}
//
server.delete<{ Params: { name: string, }, Reply: DatasetDeleteResponse }>("/datasets/clones/:name", async (request, _response) => {
server.log.info({ headers: request.headers, query: request.body, }, "datasets.clones.delete");
return deleteDataset(staticData, request.params.name);
});
server.get("/health", async (request, response) => {
server.log.info({ headers: request.headers, query: request.body, }, "health.request");
response.statusCode = 204;
@ -49,7 +80,7 @@ process.on('SIGINT', () => {
const start = async () => {
try {
staticData = {'@default' : await loadStaticData("ChinookData.xml.gz")};
await server.listen(port, "0.0.0.0");
}
catch (err) {

View File

@ -1,3 +1,4 @@
node_modules
dist
./*.sqlite
./dataset_clones/

View File

@ -68,6 +68,11 @@ Note: Boolean flags `{FLAG}` can be provided as `1`, `true`, `yes`, or omitted a
| `LOG_LEVEL` | `fatal` \| `error` \| `info` \| `debug` \| `trace` \| `silent` | `info` | The minimum log level to output |
| `METRICS` | `{FLAG}` | `false` | Enables a `/metrics` prometheus metrics endpoint. |
| `QUERY_LENGTH_LIMIT` | `INT` | `Infinity` | Puts a limit on the length of generated SQL before execution. |
| `DATASETS` | `{FLAG}` | `false` | Enable dataset operations |
| `DATASET_DELETE` | `{FLAG}` | `false` | Enable `DELETE /datasets/:name` |
| `DATASET_TEMPLATES` | `DIRECTORY` | `./dataset_templates` | Directory to clone datasets from. |
| `DATASET_CLONES` | `DIRECTORY` | `./dataset_clones` | Directory to clone datasets to. |
## Agent usage
@ -95,6 +100,19 @@ The dataset used for testing the reference agent is sourced from:
* https://raw.githubusercontent.com/lerocha/chinook-database/master/ChinookDatabase/DataSources/Chinook_Sqlite.sql
### Datasets
Dataset support is enabled via the following environment variables:
* `DATASETS`
* `DATASET_DELETE`
* `DATASET_TEMPLATES`
* `DATASET_CLONES`
Templates will be looked up at `${DATASET_TEMPLATES}/${template_name}.db`.
Clones will be copied to `${DATASET_CLONES}/${clone_name}.db`.
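As a sketch of how this fits together (assuming the agent was started with `DATASETS=1` and `DATASET_DELETE=1`, and that a `Chinook.db` file exists in the templates directory; the URL and names are illustrative):
```typescript
const agent = "http://localhost:8100"; // placeholder agent URL

async function useSqliteClone(): Promise<void> {
  // Clone the "Chinook" template to a new clone named "chinook_copy".
  const { config } = await (await fetch(`${agent}/datasets/clones/chinook_copy`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ from: "Chinook" }),
  })).json();

  // The returned config's `db` property points at the cloned database file under
  // DATASET_CLONES; pass it as the config header on subsequent /schema and /query calls.
  console.log(config);

  // Deleting the clone requires DATASET_DELETE to be enabled on the agent.
  await fetch(`${agent}/datasets/clones/chinook_copy`, { method: "DELETE" });
}
```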
## Testing Changes to the Agent
Run:

View File

@ -10,7 +10,7 @@
"license": "Apache-2.0",
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.22.0",
"fastify": "^4.4.0",
"fastify-metrics": "^9.2.1",
"nanoid": "^3.3.4",
@ -54,7 +54,7 @@
"license": "MIT"
},
"node_modules/@hasura/dc-api-types": {
"version": "0.22.0",
"license": "Apache-2.0",
"devDependencies": {
"@tsconfig/node16": "^1.0.3",

View File

@ -22,7 +22,7 @@
},
"dependencies": {
"@fastify/cors": "^8.1.0",
"@hasura/dc-api-types": "0.22.0",
"fastify-metrics": "^9.2.1",
"fastify": "^4.4.0",
"nanoid": "^3.3.4",

View File

@ -1,5 +1,5 @@
import { configSchema } from "./config"
import { DATASETS, METRICS, MUTATIONS } from "./environment"
import { CapabilitiesResponse, ScalarTypeCapabilities } from "@hasura/dc-api-types"
@ -129,6 +129,7 @@ export const capabilitiesResponse: CapabilitiesResponse = {
),
explain: {},
raw: {},
... (DATASETS ? { datasets: {} } : {}),
... (METRICS ? { metrics: {} } : {})
},
}

View File

@ -0,0 +1,79 @@
import { connect, SqlLogger } from './db';
import { DatasetDeleteResponse, DatasetGetResponse, DatasetPostRequest, DatasetPostResponse } from '@hasura/dc-api-types';
import { promises, existsSync } from 'fs';
import { DATASET_CLONES, DATASET_DELETE, DATASET_TEMPLATES } from "./environment";
import path from 'path';
export async function getDataset(template_name: string): Promise<DatasetGetResponse> {
const path = mkTemplatePath(template_name);
if(existsSync(path)) {
const stats = await promises.stat(path);
if(stats.isFile()) {
return { exists: true };
} else {
return { exists: false };
}
} else {
return { exists: false };
}
}
export async function cloneDataset(logger: SqlLogger, clone_name: string, body: DatasetPostRequest): Promise<DatasetPostResponse> {
const fromPath = mkTemplatePath(body.from);
const toPath = mkClonePath(clone_name);
const fromStats = await promises.stat(fromPath);
const exists = existsSync(toPath);
if(fromStats.isFile() && ! exists) {
// Check if this is a real SQLite DB
const db = connect({ db: fromPath, explicit_main_schema: false, tables: [], meta: false }, logger);
if(db) {
db.close();
} else {
throw(Error("Dataset is not an SQLite Database!"))
}
await promises.cp(fromPath, toPath);
return { config: { db: toPath } };
} else if(exists) {
throw(Error("Dataset already exists!"))
} else {
throw(Error("Can't Clone!"))
}
}
export async function deleteDataset(clone_name: string): Promise<DatasetDeleteResponse> {
if(DATASET_DELETE) {
const path = mkClonePath(clone_name);
const exists = existsSync(path);
if(exists) {
const stats = await promises.stat(path);
if(stats.isFile()) {
await promises.rm(path);
return { message: "success" };
} else {
throw(Error("Dataset is not a file."));
}
} else {
throw(Error("Dataset does not exist."));
}
} else {
throw(Error("Dataset deletion not available."));
}
}
function mkTemplatePath(name: string): string {
const parsed = path.parse(name);
const safeName = parsed.base;
if(name != safeName) {
throw(Error(`Template name ${name} is not valid.`));
}
return path.join(DATASET_TEMPLATES, safeName);
}
function mkClonePath(name: string): string {
const parsed = path.parse(name);
const safeName = parsed.base;
if(name != safeName) {
throw(Error(`Clone name ${name} is not valid.`));
}
return path.join(DATASET_CLONES, safeName);
}

View File

@ -38,4 +38,9 @@ export const DB_PRIVATECACHE = envToBool('DB_PRIVATECACHE');
export const DEBUGGING_TAGS = envToBool('DEBUGGING_TAGS');
export const QUERY_LENGTH_LIMIT = envToNum('QUERY_LENGTH_LIMIT', Infinity);
export const MUTATIONS = envToBool('MUTATIONS');
export const DATASETS = envToBool('DATASETS');
export const DATASET_DELETE = envToBool('DATASET_DELETE');
export const DATASET_TEMPLATES = envToString('DATASET_TEMPLATES', "./dataset_templates");
export const DATASET_CLONES = envToString('DATASET_CLONES', "./dataset_clones");

View File

@ -4,12 +4,13 @@ import { getSchema } from './schema';
import { explain, queryData } from './query';
import { getConfig, tryGetConfig } from './config';
import { capabilitiesResponse } from './capabilities';
import { QueryResponse, SchemaResponse, QueryRequest, CapabilitiesResponse, ExplainResponse, RawRequest, RawResponse, ErrorResponse, MutationRequest, MutationResponse, DatasetGetResponse, DatasetPostResponse, DatasetDeleteResponse, DatasetPostRequest, DatasetTemplateName } from '@hasura/dc-api-types';
import { connect } from './db';
import metrics from 'fastify-metrics';
import prometheus from 'prom-client';
import { runRawOperation } from './raw';
import { DATASETS, DATASET_DELETE, LOG_LEVEL, METRICS, MUTATIONS, PERMISSIVE_CORS, PRETTY_PRINT_LOGS } from './environment';
import { cloneDataset, deleteDataset, getDataset } from './datasets';
const port = Number(process.env.PORT) || 8100;
@ -143,7 +144,7 @@ server.post<{ Body: QueryRequest, Reply: ExplainResponse}>("/explain", async (re
return explain(config, sqlLogger, request.body);
});
if(MUTATIONS) {
server.post<{ Body: MutationRequest, Reply: MutationResponse}>("/mutation", async (request, _response) => {
server.log.info({ headers: request.headers, query: request.body, }, "mutation.request");
throw Error("Mutations not yet implemented");
@ -170,6 +171,34 @@ server.get("/health", async (request, response) => {
}
});
// Data-Set Features - Names must match files in the associated datasets directory.
// If they exist then they are tracked for the purposes of this feature in SQLite.
if(DATASETS) {
server.get<{ Params: { template_name: DatasetTemplateName, }, Reply: DatasetGetResponse }>("/datasets/templates/:template_name", async (request, _response) => {
server.log.info({ headers: request.headers, query: request.body, }, "datasets.templates.get");
const result = await getDataset(request.params.template_name);
if(! result.exists) {
_response.statusCode = 404;
}
return result;
});
// TODO: The name param here should be a DatasetCloneName, but this isn't being code-generated.
server.post<{ Params: { clone_name: string, }, Body: DatasetPostRequest, Reply: DatasetPostResponse }>("/datasets/clones/:clone_name", async (request, _response) => {
server.log.info({ headers: request.headers, query: request.body, }, "datasets.clones.post");
return cloneDataset(sqlLogger, request.params.clone_name, request.body);
});
// Only allow deletion if this is explicitly supported by ENV configuration
if(DATASET_DELETE) {
// TODO: The name param here should be a DatasetCloneName, but this isn't being code-generated.
server.delete<{ Params: { clone_name: string, }, Reply: DatasetDeleteResponse }>("/datasets/clones/:clone_name", async (request, _response) => {
server.log.info({ headers: request.headers, query: request.body, }, "datasets.clones.delete");
return deleteDataset(request.params.clone_name);
});
}
}
server.get("/", async (request, response) => {
response.type('text/html');
return `<!DOCTYPE html>
@ -190,6 +219,9 @@ server.get("/", async (request, response) => {
<li><a href="/raw">POST /raw - Raw Query Handler</a>
<li><a href="/health">GET /health - Healthcheck</a>
<li><a href="/metrics">GET /metrics - Prometheus formatted metrics</a>
<li><a href="/datasets/templates/NAME">GET /datasets/templates/{NAME} - Information on a Dataset Template</a>
<li><a href="/datasets/clones/NAME">POST /datasets/clones/{NAME} - Create a Dataset Clone</a>
<li><a href="/datasets/clones/NAME">DELETE /datasets/clones/{NAME} - Delete a Dataset Clone</a>
</ul>
</body>
</html>

View File

@ -230,15 +230,15 @@ schemaInspectionTests opts = describe "Schema and Source Inspection" $ do
) -- Note: These fields are backend specific so we ignore their values and just verify their shapes:
<&> Lens.set (key "config_schema_response" . key "other_schemas") J.Null
<&> Lens.set (key "config_schema_response" . key "config_schema") J.Null
<&> Lens.set (key "capabilities" . _Object . Lens.at "datasets") Nothing
<&> Lens.set (key "options" . key "uri") J.Null
<&> Lens.set (_Object . Lens.at "display_name") Nothing
)
[yaml|
capabilities: *backendCapabilities
config_schema_response:
config_schema: null
other_schemas: null
options:
uri: null
|]

View File

@ -88,6 +88,7 @@ library
Hasura.Backends.DataConnector.API.V0.Scalar
Hasura.Backends.DataConnector.API.V0.Schema
Hasura.Backends.DataConnector.API.V0.Table
Hasura.Backends.DataConnector.API.V0.Dataset
other-modules:
Hasura.Backends.DataConnector.API.V0.Name

View File

@ -149,6 +149,27 @@ type RawApi =
:> ReqBody '[JSON] V0.RawRequest
:> Post '[JSON] V0.RawResponse
type DatasetGetApi =
"datasets"
:> "templates"
:> Capture "template_name" DatasetTemplateName
:> Get '[JSON] V0.DatasetGetResponse
type DatasetPostApi =
"datasets"
:> "clones"
:> Capture "clone_name" DatasetCloneName
:> ReqBody '[JSON] V0.DatasetPostRequest
:> Post '[JSON] V0.DatasetPostResponse
type DatasetDeleteApi =
"datasets"
:> "clones"
:> Capture "clone_name" DatasetCloneName
:> Delete '[JSON] V0.DatasetDeleteResponse
type DatasetApi = DatasetGetApi :<|> DatasetPostApi :<|> DatasetDeleteApi
data Prometheus
-- NOTE: This seems like quite a brittle definition and we may want to be
@ -189,13 +210,17 @@ data Routes mode = Routes
-- | 'GET /metrics'
_metrics :: mode :- MetricsApi,
-- | 'GET /raw'
_raw :: mode :- RawApi,
-- | 'GET /datasets/:template_name'
-- 'POST /datasets/:clone_name'
-- 'DELETE /datasets/:clone_name'
_datasets :: mode :- DatasetApi
}
deriving stock (Generic)
-- | servant-openapi3 does not (yet) support NamedRoutes so we need to compose the
-- API the old-fashioned way using :<|> for use by @toOpenApi@
type Api = CapabilitiesApi :<|> SchemaApi :<|> QueryApi :<|> ExplainApi :<|> MutationApi :<|> HealthApi :<|> MetricsApi :<|> RawApi :<|> DatasetApi
-- | Provide an OpenApi 3.0 schema for the API
openApiSchema :: OpenApi

View File

@ -14,6 +14,7 @@ module Hasura.Backends.DataConnector.API.V0
module Scalar,
module Schema,
module Table,
module Dataset,
)
where
@ -21,6 +22,7 @@ import Hasura.Backends.DataConnector.API.V0.Aggregate as Aggregate
import Hasura.Backends.DataConnector.API.V0.Capabilities as Capabilities
import Hasura.Backends.DataConnector.API.V0.Column as Column
import Hasura.Backends.DataConnector.API.V0.ConfigSchema as ConfigSchema
import Hasura.Backends.DataConnector.API.V0.Dataset as Dataset
import Hasura.Backends.DataConnector.API.V0.ErrorResponse as ErrorResponse
import Hasura.Backends.DataConnector.API.V0.Explain as Explain
import Hasura.Backends.DataConnector.API.V0.Expression as Expression

View File

@ -29,6 +29,7 @@ module Hasura.Backends.DataConnector.API.V0.Capabilities
MetricsCapabilities (..),
ExplainCapabilities (..),
RawCapabilities (..),
DatasetCapabilities (..),
CapabilitiesResponse (..),
)
where
@ -81,14 +82,15 @@ data Capabilities = Capabilities
_cComparisons :: Maybe ComparisonCapabilities,
_cMetrics :: Maybe MetricsCapabilities,
_cExplain :: Maybe ExplainCapabilities,
_cRaw :: Maybe RawCapabilities,
_cDatasets :: Maybe DatasetCapabilities
}
deriving stock (Eq, Show, Generic)
deriving anyclass (NFData, Hashable)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec Capabilities
defaultCapabilities :: Capabilities
defaultCapabilities = Capabilities defaultDataSchemaCapabilities Nothing Nothing Nothing mempty Nothing Nothing Nothing Nothing Nothing Nothing
instance HasCodec Capabilities where
codec =
@ -104,6 +106,7 @@ instance HasCodec Capabilities where
<*> optionalField "metrics" "The agent's metrics capabilities" .= _cMetrics
<*> optionalField "explain" "The agent's explain capabilities" .= _cExplain
<*> optionalField "raw" "The agent's raw query capabilities" .= _cRaw
<*> optionalField "datasets" "The agent's dataset capabilities" .= _cDatasets
data DataSchemaCapabilities = DataSchemaCapabilities
{ _dscSupportsPrimaryKeys :: Bool,
@ -419,6 +422,15 @@ instance HasCodec RawCapabilities where
codec =
object "RawCapabilities" $ pure RawCapabilities
data DatasetCapabilities = DatasetCapabilities {}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving anyclass (NFData, Hashable)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec DatasetCapabilities
instance HasCodec DatasetCapabilities where
codec =
object "DatasetCapabilities" $ pure DatasetCapabilities
data CapabilitiesResponse = CapabilitiesResponse
{ _crCapabilities :: Capabilities,
_crConfigSchemaResponse :: ConfigSchemaResponse,

View File

@ -2,6 +2,7 @@
module Hasura.Backends.DataConnector.API.V0.ConfigSchema
( Config (..),
emptyConfig,
ConfigSchemaResponse (..),
validateConfigAgainstConfigSchema,
)
@ -11,10 +12,12 @@ import Autodocodec qualified
import Control.DeepSeq (NFData)
import Control.Lens ((%~), (&), (.~), (^?))
import Data.Aeson (FromJSON (..), Object, ToJSON (..), Value (..), eitherDecode, encode, object, withObject, (.:), (.=), (<?>))
import Data.Aeson.KeyMap (empty)
import Data.Aeson.Lens (AsValue (..), key, members, values)
import Data.Aeson.Types (JSONPathElement (..), emptyObject)
import Data.Bifunctor (first)
import Data.ByteString.Lazy qualified as BSL
import Data.Data (Data)
import Data.HashMap.Strict.InsOrd qualified as InsOrdHashMap
import Data.Hashable (Hashable)
import Data.Maybe (fromMaybe)
@ -28,9 +31,17 @@ import Servant.API (FromHttpApiData (..), ToHttpApiData (..))
import Prelude
newtype Config = Config {unConfig :: Object}
deriving stock (Eq, Show, Ord, Data)
deriving newtype (Hashable, NFData, ToJSON, FromJSON)
emptyConfig :: Config
emptyConfig = Config empty
instance Autodocodec.HasCodec Config where
codec =
Autodocodec.named "Config" $
Autodocodec.dimapCodec Config unConfig Autodocodec.codec
instance FromHttpApiData Config where
parseUrlPiece = first Text.pack . eitherDecode . BSL.fromStrict . Text.encodeUtf8
parseHeader = first Text.pack . eitherDecode . BSL.fromStrict

View File

@ -0,0 +1,127 @@
{-# LANGUAGE OverloadedLists #-}
{-# LANGUAGE TemplateHaskell #-}
module Hasura.Backends.DataConnector.API.V0.Dataset
( DatasetTemplateName (..),
DatasetCloneName (..),
DatasetGetResponse (..),
DatasetPostRequest (..),
DatasetPostResponse (..),
DatasetDeleteResponse (..),
datasetGetSuccess,
datasetDeleteSuccess,
-- | Lenses
unDatasetTemplateName,
unDatasetCloneName,
dsExists,
dspFrom,
dspConfig,
dsdMessage,
)
where
import Autodocodec.Extended
import Autodocodec.OpenAPI ()
import Control.Lens ((&), (?~))
import Control.Lens.TH (makeLenses)
import Data.Aeson (FromJSON, ToJSON, Value)
import Data.Data (Data)
import Data.HashMap.Strict qualified as H
import Data.OpenApi (HasType (type_), OpenApiType (OpenApiString), ToParamSchema, ToSchema)
import Data.OpenApi.Internal.ParamSchema (ToParamSchema (toParamSchema))
import Data.Text (Text)
import GHC.Generics (Generic)
import Hasura.Backends.DataConnector.API.V0.ConfigSchema qualified as Config
import Servant.API (FromHttpApiData (parseUrlPiece), ToHttpApiData (toUrlPiece))
import Prelude
newtype DatasetTemplateName = DatasetTemplateName
{ _unDatasetTemplateName :: Text
}
deriving stock (Eq, Ord, Show, Data)
deriving newtype (ToParamSchema, ToHttpApiData, FromHttpApiData)
deriving (ToJSON, FromJSON, ToSchema) via Autodocodec DatasetTemplateName
instance HasCodec DatasetTemplateName where
codec =
named "DatasetTemplateName" $
dimapCodec DatasetTemplateName _unDatasetTemplateName codec
$(makeLenses ''DatasetTemplateName)
newtype DatasetCloneName = DatasetCloneName
{ _unDatasetCloneName :: Text
}
deriving stock (Eq, Ord, Show, Data)
deriving newtype (ToParamSchema, ToHttpApiData, FromHttpApiData)
deriving (ToJSON, FromJSON, ToSchema) via Autodocodec DatasetCloneName
instance HasCodec DatasetCloneName where
codec =
named "DatasetCloneName" $
dimapCodec DatasetCloneName _unDatasetCloneName codec
$(makeLenses ''DatasetCloneName)
-- | Request Dataset Info
data DatasetGetResponse = DatasetGetResponse
{ _dsExists :: Bool
}
deriving stock (Eq, Ord, Show, Data)
deriving (ToJSON, FromJSON, ToSchema) via Autodocodec DatasetGetResponse
datasetGetSuccess :: DatasetGetResponse
datasetGetSuccess = DatasetGetResponse True
instance HasCodec DatasetGetResponse where
codec =
object "DatasetGetResponse" $
DatasetGetResponse
<$> requiredField "exists" "Message detailing if the dataset exists" .= _dsExists
$(makeLenses ''DatasetGetResponse)
-- | Create a new Dataset
data DatasetPostRequest = DatasetPostRequest
{_dspFrom :: DatasetTemplateName}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec DatasetPostRequest
instance HasCodec DatasetPostRequest where
codec =
object "DatasetPostRequest" $
DatasetPostRequest
<$> requiredField "from" "The named dataset to clone from" .= _dspFrom
$(makeLenses ''DatasetPostRequest)
data DatasetPostResponse = DatasetPostResponse
{_dspConfig :: Config.Config}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec DatasetPostResponse
instance HasCodec DatasetPostResponse where
codec =
object "DatasetPostResponse" $
DatasetPostResponse
<$> requiredField "config" "A config to connect to the cloned dataset" .= _dspConfig
$(makeLenses ''DatasetPostResponse)
-- | Delete a Dataset
data DatasetDeleteResponse = DatasetDeleteResponse
{ _dsdMessage :: Text
}
deriving stock (Eq, Ord, Show, Generic, Data)
deriving (FromJSON, ToJSON, ToSchema) via Autodocodec DatasetDeleteResponse
datasetDeleteSuccess :: DatasetDeleteResponse
datasetDeleteSuccess = DatasetDeleteResponse "success"
instance HasCodec DatasetDeleteResponse where
codec =
object "DatasetDeleteResponse" $
DatasetDeleteResponse
<$> requiredField "message" "Message detailing the result of the delete operation" .= _dsdMessage
$(makeLenses ''DatasetDeleteResponse)

View File

@ -1,4 +1,5 @@
{-# LANGUAGE QuasiQuotes #-}
{-# LANGUAGE TypeOperators #-}
-- | Mock Agent Warp server backend
module Harness.Backend.DataConnector.Mock.Server
@ -67,7 +68,8 @@ capabilities =
},
API._cMetrics = Just API.MetricsCapabilities {},
API._cExplain = Just API.ExplainCapabilities {},
API._cRaw = Just API.RawCapabilities {},
API._cDatasets = Just API.DatasetCapabilities {}
},
_crConfigSchemaResponse =
API.ConfigSchemaResponse
@ -806,6 +808,13 @@ metricsHandler = pure "# NOTE: Metrics would go here."
rawHandler :: API.SourceName -> API.Config -> API.RawRequest -> Handler API.RawResponse
rawHandler _ _ _ = pure $ API.RawResponse [] -- NOTE: Raw query response would go here.
datasetHandler :: (API.DatasetTemplateName -> Handler API.DatasetGetResponse) :<|> ((API.DatasetCloneName -> API.DatasetPostRequest -> Handler API.DatasetPostResponse) :<|> (API.DatasetCloneName -> Handler API.DatasetDeleteResponse))
datasetHandler = datasetGetHandler :<|> datasetPostHandler :<|> datasetDeleteHandler
where
datasetGetHandler _ = pure $ API.datasetGetSuccess
datasetPostHandler _ _ = pure $ API.DatasetPostResponse API.emptyConfig
datasetDeleteHandler _ = pure $ API.datasetDeleteSuccess
dcMockableServer :: I.IORef MockConfig -> I.IORef (Maybe AgentRequest) -> I.IORef (Maybe API.Config) -> Server API.Api
dcMockableServer mcfg mRecordedRequest mRecordedRequestConfig =
mockCapabilitiesHandler mcfg
@ -816,6 +825,7 @@ dcMockableServer mcfg mRecordedRequest mRecordedRequestConfig =
:<|> healthcheckHandler
:<|> metricsHandler
:<|> rawHandler
:<|> datasetHandler
mockAgentPort :: Warp.Port
mockAgentPort = 65006

View File

@ -130,6 +130,9 @@ genExplainCapabilities = pure ExplainCapabilities {}
genRawCapabilities :: MonadGen m => m RawCapabilities
genRawCapabilities = pure RawCapabilities {}
genDatasetCapabilities :: MonadGen m => m DatasetCapabilities
genDatasetCapabilities = pure DatasetCapabilities {}
genCapabilities :: Gen Capabilities
genCapabilities =
Capabilities
@ -143,6 +146,7 @@ genCapabilities =
<*> Gen.maybe genMetricsCapabilities
<*> Gen.maybe genExplainCapabilities
<*> Gen.maybe genRawCapabilities
<*> Gen.maybe genDatasetCapabilities
emptyConfigSchemaResponse :: ConfigSchemaResponse
emptyConfigSchemaResponse = ConfigSchemaResponse mempty mempty