Move Typescript types for Data Connector agent into their own package

PR-URL: https://github.com/hasura/graphql-engine-mono/pull/5596
GitOrigin-RevId: c5da90eb4e61a9d9a5ddc34f7bfbaa2d00c698b8
Daniel Chambers 2022-09-05 16:08:14 +10:00 committed by hasura-bot
parent 0b7353657c
commit 97b0e4c591
166 changed files with 7966 additions and 7343 deletions

1
dc-agents/.gitignore vendored Normal file

@ -0,0 +1 @@
node_modules

1
dc-agents/.npmrc Normal file

@ -0,0 +1 @@
//registry.npmjs.org/:_authToken=${NPM_TOKEN}

1
dc-agents/.nvmrc Normal file

@ -0,0 +1 @@
v16.17.0

74
dc-agents/CONTRIBUTING.md Normal file

@ -0,0 +1,74 @@
# Contributing to the Data Connector Agents
## Getting Set Up
Requirements:
- NodeJS 16 - The easiest way to install it is to use [nvm](https://github.com/nvm-sh/nvm), then run `nvm use` to select the correct NodeJS version via the `.nvmrc` file
Once node is installed, run `npm ci` to restore all npm packages.
## Project Structure
- `dc-api-types` - These are the TypeScript types, generated from the OpenAPI spec of the Data Connector Agent API. The OpenAPI spec is generated from our Haskell types in `/server/src-dc-api`.
- `reference` - The Reference agent that serves as an example implementation of a Data Connector agent
- `sqlite` - The SQLite Data Connector agent
- `sdk` - Assets that go into the Data Connector SDK zip file
- `scripts` - Scripts used for managing the codebase
`dc-api-types`, `sqlite` and `reference` are all npm packages that are included in the root workspace defined by `package.json`. Linking them via [workspaces](https://docs.npmjs.com/cli/v7/using-npm/workspaces) means you can change the `dc-api-types` and have those changes immediately flow into the `reference` and `sqlite` packages.
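As a rough sketch of what that linking looks like on disk (paths illustrative; npm does this for you during `npm ci`, you never run these commands yourself):

```bash
# Hypothetical illustration of the npm workspaces layout: the root node_modules
# contains a symlink pointing at the workspace package's folder, so edits to
# dc-api-types are immediately visible to the reference and sqlite packages.
demo=$(mktemp -d)
mkdir -p "$demo/dc-api-types" "$demo/node_modules/@hasura"
ln -s ../../dc-api-types "$demo/node_modules/@hasura/dc-api-types"
echo 'export type TableName = string[];' > "$demo/dc-api-types/index.ts"
# Reading through the symlink sees the same file:
cat "$demo/node_modules/@hasura/dc-api-types/index.ts"
```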
To restore the npm modules used by all the projects, ensure you run `npm ci` in the `/dc-agents` directory (ie. this directory).
### Deriving lockfiles
Because `sqlite` and `reference` are linked into the root workspace, they don't normally get their own lockfiles (ie. `package-lock.json`), as the lockfile is managed at the root workspace level. However, we want to be able to take these projects and build them outside of the workspace setup we have here, where the root `package-lock.json` does not exist.
In order to achieve this, we have a tool that will derive individual `package-lock.json` files for the `reference` and `sqlite` packages from the root `package-lock.json` file. These derived `package-lock.json` files are committed to the repository so that they can be used by the package-specific Dockerfiles (eg `reference/Dockerfile`).
This means that whenever you modify the root `package-lock.json` (which will happen whenever you change the dependencies in any of the packages), you need to re-derive the individual packages' `package-lock.json` files. You can do that easily by running
```bash
> make derive-lockfiles
```
There is more information about how this derivation process works inside the script that does the derivation (`scripts/derive-lockfile.ts`).
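The core rewriting step can be sketched in a few lines (a simplified model only, with illustrative package names; the real `scripts/derive-lockfile.ts` additionally resolves symlinked workspace packages and handles version conflicts):

```typescript
// Simplified model (not the real script): entries keyed under
// "<workspace>/node_modules/..." in the root package-lock.json are lifted to
// "node_modules/..." in the derived lockfile.
type Packages = Record<string, unknown>;

function liftWorkspacePackages(packages: Packages, workspace: string): Packages {
  const prefix = `${workspace}/`;
  return Object.fromEntries(
    Object.entries(packages).map(([pkgPath, pkg]) =>
      pkgPath.startsWith(prefix) ? [pkgPath.slice(prefix.length), pkg] : [pkgPath, pkg]
    )
  );
}

// Example: a workspace-local fastify entry is lifted to the root level,
// while an entry already at the root is left alone.
const lifted = liftWorkspacePackages(
  {
    "node_modules/typescript": { version: "4.7.4" },
    "reference/node_modules/fastify": { version: "3.29.0" },
  },
  "reference"
);
// lifted is now keyed by "node_modules/typescript" and "node_modules/fastify".
```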
### Docker
There are actually two Dockerfiles for each agent; for example, there are both `Dockerfile-reference` and `reference/Dockerfile`.
`Dockerfile-reference` builds a Docker container that runs the Reference agent by copying the root workspace into the container and maintaining the workspace structure inside it. This is useful if you need a Docker container with a Reference agent that uses as-of-yet unpublished (to npm) changes in the `dc-api-types` package, as it will include `dc-api-types` in the Docker container. This Dockerfile is good to use while developing.
On the other hand, `reference/Dockerfile` will build a Docker container that will run the Reference agent, but it builds the agent independent of the workspace, so it will try to restore the `dc-api-types` package from npm (via the derived `reference/package-lock.json` file). This is good for official releases of the Reference agent, where all dependencies have been published to npm already and are available for package restore.
## Running the agents
Ensure you have run `npm ci` before doing the following.
### Reference agent
```bash
> make start-reference-agent
```
### SQLite agent
```bash
> make start-sqlite-agent
```
## Generating the TypeScript types (`dc-api-types`)
To regenerate the TypeScript types from the Haskell types, run
```bash
> make regenerate-types
```
This will regenerate the types, bump the version number in the `dc-api-types` project, update the agents to use the new version number, and update and re-derive all the lockfiles.
To only (re)generate the TypeScript types from the OpenAPI spec (ie. `dc-api-types/src/agent.openapi.json`), run
```bash
> make generate-types
```
If you need to manually change the version number in the `dc-api-types` project, you can update all dependencies to use the new version automatically by running
```
> make update-api-types-deps
```
## Publishing the TypeScript types package (`dc-api-types`) to npm
The TypeScript types package in `dc-api-types` will be published to npm by the continuous integration build system on every commit to main. It will only publish the package if the version specified in `dc-api-types/package.json` hasn't already been published.


@ -0,0 +1,24 @@
FROM node:16-alpine
WORKDIR /app
COPY package.json .
COPY package-lock.json .
COPY dc-api-types dc-api-types
WORKDIR /app/reference
COPY ./reference/package.json .
RUN npm ci
COPY ./reference/tsconfig.json .
COPY ./reference/src src
# This is just to ensure everything compiles ahead of time.
# We'll actually run using ts-node to ensure we get TypeScript
# stack traces if something fails at runtime.
RUN npm run typecheck
EXPOSE 8100
# We don't bother doing typechecking when we run (only TS->JS transpiling)
# because we checked it above already. This uses less memory at runtime.
CMD [ "npm", "run", "--silent", "start-no-typecheck" ]


@ -0,0 +1,24 @@
FROM node:16-alpine
WORKDIR /app
COPY package.json .
COPY package-lock.json .
COPY dc-api-types dc-api-types
WORKDIR /app/sqlite
COPY ./sqlite/package.json .
RUN npm ci
COPY ./sqlite/tsconfig.json .
COPY ./sqlite/src src
# This is just to ensure everything compiles ahead of time.
# We'll actually run using ts-node to ensure we get TypeScript
# stack traces if something fails at runtime.
RUN npm run typecheck
EXPOSE 8100
# We don't bother doing typechecking when we run (only TS->JS transpiling)
# because we checked it above already. This uses less memory at runtime.
CMD [ "npm", "run", "--silent", "start-no-typecheck" ]

62
dc-agents/Makefile Normal file

@ -0,0 +1,62 @@
SHELL := bash -e -u -o pipefail
# default target
.PHONY: help
## help: prints help message
help:
@echo "Usage:"
@sed -n 's/^##//p' ${MAKEFILE_LIST} | column -t -s ':' | sed -e 's/^/ /'
.PHONY: typecheck
## typecheck: Typechecks all workspaces
typecheck: typecheck-dc-api-types typecheck-reference-agent typecheck-sqlite-agent
.PHONY: typecheck-dc-api-types
## typecheck-dc-api-types: Typechecks the dc-api-types
typecheck-dc-api-types:
npm run -w dc-api-types typecheck
.PHONY: typecheck-reference-agent
## typecheck-reference-agent: Typechecks the Reference agent
typecheck-reference-agent:
npm run -w reference typecheck
.PHONY: typecheck-sqlite-agent
## typecheck-sqlite-agent: Typechecks the SQLite agent
typecheck-sqlite-agent:
npm run -w sqlite typecheck
.PHONY: start-reference-agent
## start-reference-agent: Starts the Reference agent
start-reference-agent:
npm start -w reference
.PHONY: start-sqlite-agent
## start-sqlite-agent: Starts the SQLite agent
start-sqlite-agent:
npm start -w sqlite
TESTS_DC_API := cabal run test:tests-dc-api --
.PHONY: generate-types
## generate-types: Generates the TypeScript API types in dc-api-types
generate-types: export TESTS_DC_API := $(TESTS_DC_API)
generate-types:
./scripts/generate-types.sh
.PHONY: regenerate-types
## regenerate-types: Regenerates the TypeScript API types in dc-api-types from the original Haskell types
regenerate-types: export TESTS_DC_API := $(TESTS_DC_API)
regenerate-types:
rm -f ./dc-api-types/src/agent.openapi.json
./scripts/generate-types.sh
.PHONY: update-api-types-deps
## update-api-types-deps: Updates packages that are dependent on dc-api-types with its current version from its package.json
update-api-types-deps:
./scripts/update-api-types-deps.sh
.PHONY: derive-lockfiles
## derive-lockfiles: Derives individual lockfiles for the workspace packages from the lockfile in the root project
derive-lockfiles:
npx ts-node ./scripts/derive-lockfile.ts -l package-lock.json -w reference -w sqlite


@ -0,0 +1,5 @@
# Hasura GraphQL Engine Data Connector Agent API Types
This package contains TypeScript types that Data Connector agents implemented in TypeScript can use to correctly implement the required API.
The [Data Connector Reference agent](https://github.com/hasura/graphql-engine/tree/master/dc-agents/reference) can be used as an example of these types being used in practice to implement a Data Connector agent.
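As a minimal sketch of consuming the package (the generated types are richer than this; a simplified local stand-in is declared here so the example is self-contained):

```typescript
// In a real agent you would import the generated types:
//   import { CapabilitiesResponse } from "@hasura/dc-api-types";
// A simplified stand-in declaration keeps this sketch self-contained.
type CapabilitiesResponse = {
  capabilities: { relationships?: Record<string, never> };
};

// Mirrors the shape of the Reference agent's capabilities endpoint payload.
const capabilitiesResponse: CapabilitiesResponse = {
  capabilities: { relationships: {} },
};
```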


@ -0,0 +1,36 @@
{
"name": "@hasura/dc-api-types",
"version": "0.3.0",
"description": "Hasura GraphQL Engine Data Connector Agent API types",
"author": "Hasura (https://github.com/hasura/graphql-engine)",
"license": "Apache-2.0",
"repository": {
"type": "git",
"url": "git+https://github.com/hasura/graphql-engine.git"
},
"bugs": {
"url": "https://github.com/hasura/graphql-engine/issues"
},
"homepage": "https://github.com/hasura/graphql-engine#readme",
"keywords": [
"hasura",
"data-connectors"
],
"types": "./src/index.ts",
"exports": "./src/index.ts",
"files": [
"./src",
"./README.md"
],
"dependencies": {},
"scripts": {
"build": "tsc",
"typecheck": "tsc --noEmit",
"test": "echo \"Error: no test specified\" && exit 1"
},
"devDependencies": {
"@tsconfig/node16": "^1.0.3",
"@types/node": "^16.11.49",
"typescript": "^4.7.4"
}
}


@ -0,0 +1,10 @@
{
"extends": "@tsconfig/node16/tsconfig.json",
"compilerOptions": {
"outDir": "dist",
"resolveJsonModule": true,
},
"include": [
"src/**/*"
]
}

5145
dc-agents/package-lock.json generated Normal file

File diff suppressed because it is too large

25
dc-agents/package.json Normal file

@ -0,0 +1,25 @@
{
"name": "@hasura/dc-agents",
"private": true,
"workspaces": [
"dc-api-types",
"reference",
"sqlite"
],
"scripts": {
"generate-types": "./scripts/generate-types.sh",
"update-api-types-deps": "./scripts/update-api-types-deps.sh",
"derive-lockfiles": "ts-node ./scripts/derive-lockfile.ts -l package-lock.json -w reference -w sqlite"
},
"dependencies": {
"yargs": "^17.5.1"
},
"devDependencies": {
"@tsconfig/node16": "^1.0.3",
"@types/node": "^16.11.49",
"@types/yargs": "^17.0.11",
"openapi-typescript-codegen": "^0.23.0",
"ts-node": "^10.9.1",
"typescript": "^4.7.4"
}
}


@ -1 +1 @@
lts/gallium
v16.17.0

File diff suppressed because it is too large


@ -1,6 +1,6 @@
{
"name": "dc-agent-reference",
"version": "1.0.0",
"name": "@hasura/dc-agent-reference",
"version": "0.1.0",
"description": "Reference implementation of a Data Connector Agent for Hasura GraphQL Engine",
"author": "Hasura (https://github.com/hasura/graphql-engine)",
"license": "Apache-2.0",
@ -18,22 +18,21 @@
"typecheck": "tsc --noEmit",
"start": "ts-node ./src/index.ts",
"start-no-typecheck": "ts-node --transpileOnly ./src/index.ts",
"test": "echo \"Error: no test specified\" && exit 1",
"generate-types": "./scripts/generate-types.sh"
"test": "echo \"Error: no test specified\" && exit 1"
},
"dependencies": {
"@fastify/cors": "^7.0.0",
"@hasura/dc-api-types": "0.3.0",
"fastify": "^3.29.0",
"mathjs": "^11.0.0",
"pino-pretty": "^8.0.0",
"xml2js": "github:Leonidas-from-XIV/node-xml2js"
},
"devDependencies": {
"@tsconfig/node16": "^1.0.2",
"@types/node": "^16.11.38",
"@tsconfig/node16": "^1.0.3",
"@types/node": "^16.11.49",
"@types/xml2js": "^0.4.11",
"openapi-typescript-codegen": "^0.23.0",
"ts-node": "^10.8.1",
"typescript": "^4.7.3"
"ts-node": "^10.9.1",
"typescript": "^4.7.4"
}
}


@ -1,21 +0,0 @@
#!/usr/bin/env bash
set -euo pipefail
PROJECT_ROOT="$( cd "$( dirname "${BASH_SOURCE[0]}" )/.." >/dev/null 2>&1 && pwd )" # ... https://stackoverflow.com/a/246128/176841
cd "$PROJECT_ROOT"
TYPES_DIR="./src/types"
SCHEMA_FILE="$TYPES_DIR/agent.openapi.json"
mkdir -p $TYPES_DIR
if [ ! -f $SCHEMA_FILE ] ; then
echo "$SCHEMA_FILE does not exist, re-generating it using the agent test suite"
cabal run test:tests-dc-api -- export-openapi-spec | tail -n 1 | jq > $SCHEMA_FILE
fi
echo "Deleting existing generated model..."
rm -rf "$TYPES_DIR/models"
rm -f "$TYPES_DIR/index.ts"
echo "Generating model from $SCHEMA_FILE..."
openapi --useUnionTypes --input $SCHEMA_FILE --output $TYPES_DIR --exportServices false --exportCore false --indent 2


@ -1,5 +1,5 @@
import { configSchema } from "./config"
import { CapabilitiesResponse } from "./types"
import { CapabilitiesResponse } from "@hasura/dc-api-types"
export const capabilitiesResponse: CapabilitiesResponse = {
capabilities: { relationships: {} },


@ -1,5 +1,5 @@
import { FastifyRequest } from "fastify"
import { ConfigSchemaResponse } from "./types"
import { ConfigSchemaResponse } from "@hasura/dc-api-types"
export type Config = {
tables: string[] | null


@ -1,4 +1,4 @@
import { SchemaResponse, TableName } from "../types"
import { SchemaResponse, TableName } from "@hasura/dc-api-types"
import { Config } from "../config";
import xml2js from "xml2js"
import fs from "fs"


@ -4,7 +4,7 @@ import { filterAvailableTables, getSchema, getTable, loadStaticData } from './da
import { queryData } from './query';
import { getConfig } from './config';
import { capabilitiesResponse } from './capabilities';
import { CapabilitiesResponse, SchemaResponse, QueryRequest, QueryResponse } from './types';
import { CapabilitiesResponse, SchemaResponse, QueryRequest, QueryResponse } from '@hasura/dc-api-types';
const port = Number(process.env.PORT) || 8100;
const server = Fastify({ logger: { prettyPrint: true } });


@ -1,4 +1,4 @@
import { QueryRequest, TableRelationships, Relationship, Query, Field, OrderBy, Expression, BinaryComparisonOperator, UnaryComparisonOperator, BinaryArrayComparisonOperator, ComparisonColumn, ComparisonValue, ScalarValue, Aggregate, SingleColumnAggregate, ColumnCountAggregate, TableName, OrderByElement, OrderByRelation } from "./types";
import { QueryRequest, TableRelationships, Relationship, Query, Field, OrderBy, Expression, BinaryComparisonOperator, UnaryComparisonOperator, BinaryArrayComparisonOperator, ComparisonColumn, ComparisonValue, ScalarValue, Aggregate, SingleColumnAggregate, ColumnCountAggregate, TableName, OrderByElement, OrderByRelation } from "@hasura/dc-api-types";
import { coerceUndefinedToNull, crossProduct, tableNameEquals, unreachable, zip } from "./util";
import * as math from "mathjs";


@ -1,4 +1,4 @@
import { TableName } from "./types";
import { TableName } from "@hasura/dc-api-types";
export const coerceUndefinedToNull = <T>(v: T | undefined): T | null => v === undefined ? null : v;


@ -0,0 +1,309 @@
/* npm workspaces work by installing workspace packages into the root workspace folder's node_modules
* folder using a symlink to the actual package folder on disk. Doing this means that any changes
 * to a workspace package automatically show up in dependent packages, since they are transparently
* looking at its source code directly via the symlink.
*
* For example, the `reference` package is dependent on the `dc-api-types` package. Because they are
* workspaces, npm will create a symlink from `./node_modules/@hasura/dc-api-types` to
* `./dc-api-types`. Therefore, when the `reference` package looks at files in
* `./node_modules/@hasura/dc-api-types` it is actually seeing `./dc-api-types`.
*
* The lockfile for a workspace is created at the root level (ie `./package-lock.json`) and _not_
* at the individual package level (ie. _not_ `./reference/package-lock.json`). This is a problem
* if you want to be able to work with that package outside of the workspace (for example, if you
* copy it into a Docker container to run it, or export it to a different repo using copybara).
*
* The purpose of this script is to derive a lockfile for a workspace project from the lockfile of
* the root workspace. We do this by following all dependencies from the workspace project and
* lifting them up from being located in `./<workspace project>/node_modules` to being in
* `./node_modules`.
*
* For example:
* `.packages["reference/node_modules/fastify"]` moves to `.packages["node_modules/fastify"]`
* (The `.packages` object in the lockfile has properties that each represent the installed
* location of a package on disk (the property name), and information about what is installed
* there (the property value))
*
* However, it is not that simple unfortunately. There is a bit of rejigging that needs to be done
* in some cases.
*
* npm's root lockfile has special entries in it when it symlinks a package (for example see
* `.packages["node_modules/@hasura/dc-api-types"]`). These need to be replaced with the
* information about the symlinked package directly since the derived lockfiles are to be used
* independently outside the context of the workspace and therefore a symlink will not exist.
*
 * npm also floats dependencies that are shared between workspace packages, or used by the
 * root workspace package, up to the root package level. When these dependencies are used by the
* package we're deriving a lockfile for, we leave them at the root level, since the source
* location (ie `./node_modules`) is the same as the destination.
*
 * However, sometimes a dependency may exist at the root level, but a package depended on by the
* workspace package may require a different version (since npm allows different versions of
* the same package to exist and be used simultaneously). For example, the `camelcase` v6
* package exists at `node_modules/camelcase` because `openapi-typescript-codegen` requires it,
* and it has been lifted to the root `node_modules`. However, `camelcase` is also required by
 * the `args` package, which is used by a dependency of the `reference` package, but the `args`
* package requires v5, not v6. So npm has installed v5 `camelcase` at
* `reference/node_modules/camelcase` so that it won't conflict with v6 in
* `node_modules/camelcase`. But this means we now have two versions of `camelcase` and if we lift
* `reference/node_modules/camelcase` into `node_modules/camelcase`, we will replace v6 with v5.
 * The solution is to push v5 down one level of the package path of the package that
* depends on it. So, because `args` (located in `reference/node_modules/args`) depends on
* `camelcase` v5, we move it from `reference/node_modules/camelcase` to
* `node_modules/args/node_modules/camelcase` in the derived lockfile. Now v6 and v5 can co-exist.
*
* Usage:
*
* Derive lockfiles for both the `reference` and `sqlite` workspace packages
* > npx ts-node ./scripts/derive-lockfile.ts -l package-lock.json -w reference -w sqlite
*/
import fs from "fs/promises";
import os from "os";
import path from "path";
import * as yargs from "yargs";
type Lockfile = {
name: string,
version?: string,
requires: boolean,
lockfileVersion: number,
packages: Record<string, Package>,
dependencies?: Record<string, unknown>,
}
type Package = {
link?: true,
resolved?: string
name?: string,
version?: string,
dependencies?: Record<string, string>,
devDependencies?: Record<string, string>,
peerDependencies?: Record<string, string>,
peerDependenciesMeta?: Record<string, { optional?: boolean }>,
}
type LocatedPackage = {
dependantPackagePath: PackagePath,
packagePath: PackagePath,
package: Package,
}
type Dependency = {
name: string,
package: LocatedPackage,
}
// Example paths:
// reference/node_modules/avvio/node_modules/debug -> { workspace: "reference", path: ["avvio", "debug"] }
// node_modules/ts-node -> { workspace: null, path: ["ts-node"] }
type PackagePath = {
// Workspace folder name, if any
workspace: string | null,
// Path of package names
path: string[]
}
/**
* Locates a package dependency searching first in the dependant package's node_modules and then
* backwards up the package path until we reach the root workspace's node_modules folder.
*
* @param lockfile
* @param dependantPackagePath The package path of the package that depends on the package being located
 * @param packageName The name of the package being located
* @returns The located package or undefined if it could not be found
*/
function locatePackageDependency(lockfile: Lockfile, dependantPackagePath: PackagePath, packageName: string): LocatedPackage | undefined {
const workspacePrefix = dependantPackagePath.workspace !== null ? dependantPackagePath.workspace + "/" : "";
function locate(packageSearchPath: PackagePath): LocatedPackage | undefined {
if (packageSearchPath.path.length === 0) {
const pkgPath = workspacePrefix + "node_modules/" + packageName;
const pkg = lockfile.packages[pkgPath];
return pkg !== undefined
? { dependantPackagePath,
packagePath: packageSearchPath,
package: pkg }
: packageSearchPath.workspace !== null
? locatePackageDependency(lockfile, { workspace: null, path: [] }, packageName) // If there's no package in the workspace, try the root level above the workspace
: undefined;
} else {
const pkgPath = workspacePrefix + "node_modules/" + packageSearchPath.path.join("/node_modules/") + "/node_modules/" + packageName;
const pkg = lockfile.packages[pkgPath];
return pkg !== undefined
? { dependantPackagePath,
packagePath: packageSearchPath,
package: pkg }
: locate({...packageSearchPath, path: packageSearchPath.path.slice(0, packageSearchPath.path.length - 1)});
}
}
return locate(dependantPackagePath);
}
/**
* Given the specified package and its package path, find all transitive dependencies.
*/
function collectDeps(lockfile: Lockfile, packagePath: PackagePath, pkg: Package): Dependency[] {
const deps =
[ ...(Object.keys(pkg.dependencies ?? {}).map<[string, boolean]>(d => [d, false])),
...(Object.keys(pkg.devDependencies ?? {}).map<[string, boolean]>(d => [d, false])),
...(Object.keys(pkg.peerDependencies ?? {})
.map<[string, boolean]>(peerDep => {
const optional = pkg.peerDependenciesMeta?.[peerDep]?.optional === true;
return [peerDep, optional];
}))
];
return deps.flatMap(([depPkgName, optional]) => {
const locatedDep = locatePackageDependency(lockfile, packagePath, depPkgName);
if (locatedDep === undefined) {
if (optional)
return [];
else
throw new Error(`Can't locate package '${depPkgName}'`);
} else if (locatedDep.package.link === true) {
if (locatedDep.package.resolved === undefined)
throw new Error(`Linked package '${depPkgName}' does not have a resolved property`);
const resolvedPkg = lockfile.packages[locatedDep.package.resolved];
if (resolvedPkg === undefined)
throw new Error(`Cannot find package '${locatedDep.package.resolved}' resolved from linked package '${depPkgName}'`);
const linkedPkg = { ...resolvedPkg };
delete linkedPkg.name;
const dependency = {
name: depPkgName,
package: {
...locatedDep,
package: linkedPkg
}
};
return [dependency, ...collectDeps(lockfile, {...locatedDep.packagePath, path: [...locatedDep.packagePath.path, depPkgName]}, linkedPkg)];
} else {
const dependency = {
name: depPkgName,
package: locatedDep
};
return [dependency, ...collectDeps(lockfile, {...locatedDep.packagePath, path: [...locatedDep.packagePath.path, depPkgName]}, locatedDep.package)];
}
});
}
/**
* Relocate a workspace's transitive dependencies from their current location to
* where they ought to live for the derived lockfile
*/
function relocateWorkspacePackagesToRoot(dependencies: Dependency[]) {
return dependencies.map(dep => {
if (dep.package.packagePath.workspace !== null) {
// If the package is found at both the root and directly within the workspace
// then we're going to have a conflict and the workspace package needs to be pushed
// into the node_modules folder of the dependency one down from the workspace
// to prevent it from conflicting
if (dep.package.packagePath.path.length === 0) {
const conflictingPackageAlreadyAtRoot =
dependencies.find(d =>
d.name === dep.name
&& d.package.packagePath.workspace === null
&& d.package.packagePath.path.length === 0
);
if (conflictingPackageAlreadyAtRoot !== undefined) {
return {
...dep,
package: {
...dep.package,
packagePath: {
workspace: null,
path: dep.package.dependantPackagePath.path.slice(0, 1),
}
}
}
}
}
// Remove the workspace from the path to lift it to the root
return {
...dep,
package: {
...dep.package,
packagePath: {
...dep.package.packagePath,
workspace: null,
}
}
};
} else {
// Already at root
return dep;
}
});
}
/**
* Derive a lockfile for a workspace package from the lockfile at the root of the workspace
*
* @param lockfile The root lockfile
* @param workspace The workspace for which to derive a lockfile
* @returns The derived lockfile
*/
function deriveLockfile(lockfile: Lockfile, workspace: string): Lockfile {
const workspacePackage = lockfile.packages[workspace];
if (workspacePackage.name === undefined) {
throw new Error(`The workspace package ${workspace} is missing a name`);
}
const deps = collectDeps(lockfile, { workspace, path: [] }, workspacePackage);
const relocatedDeps = relocateWorkspacePackagesToRoot(deps);
const packages = Object.fromEntries(relocatedDeps.map<[string, Package]>(dep => {
return dep.package.packagePath.path.length === 0
? ["node_modules/" + dep.name, dep.package.package]
: ["node_modules/" + dep.package.packagePath.path.join("/node_modules/") + "/node_modules/" + dep.name, dep.package.package]
}));
return {
name: workspacePackage.name,
version: workspacePackage.version,
// We use version 3 because it is the same as version 2 (npm's current default)
// except without the backwards compatibility of having the "dependencies"
// property that is only used by old versions of npm that we don't use or
// care about. This way we don't need to rewrite that property too.
lockfileVersion: 3,
requires: true,
packages: {
"": workspacePackage,
...packages
}
};
}
const argParser = yargs
.option("lockfile", {
alias: "l",
describe: "The root workspace lockfile",
type: "string"
})
.option("workspace", {
alias: "w",
describe: "The workspace you want to derive a lockfile for",
type: "string"
})
.array("workspace")
.demandOption(["lockfile", "workspace"])
.help();
(async () => {
const args = await argParser.argv;
console.log(`Reading lockfile '${args.lockfile}'...`);
const lockfile = JSON.parse(await fs.readFile(args.lockfile, "utf-8"));
for (const workspace of args.workspace) {
const outputFile = path.join(path.dirname(args.lockfile), `./${workspace}/package-lock.json`);
console.log(`Deriving lockfile for workspace '${workspace}'...`);
const workspaceLockfile = deriveLockfile(lockfile, workspace);
console.log(`Writing derived lockfile to '${outputFile}'...`);
await fs.writeFile(outputFile, JSON.stringify(workspaceLockfile, null, 2) + os.EOL, "utf-8");
}
console.log("Done deriving lockfile" + (args.workspace.length > 1 ? "s" : ""))
})();


@ -0,0 +1,39 @@
#!/usr/bin/env bash
set -euo pipefail
PROJECT_ROOT="$( cd "$( dirname "${BASH_SOURCE[0]}" )/.." >/dev/null 2>&1 && pwd )" # ... https://stackoverflow.com/a/246128/176841
cd "$PROJECT_ROOT"
TYPES_PROJECT_DIR="./dc-api-types"
TYPES_DIR="$TYPES_PROJECT_DIR/src"
SCHEMA_FILE="$TYPES_DIR/agent.openapi.json"
mkdir -p $TYPES_DIR
if [ ! -f $SCHEMA_FILE ] ; then
echo "$SCHEMA_FILE does not exist, re-generating it using the agent test suite"
if [ -z "$TESTS_DC_API" ]; then
echo "Expected TESTS_DC_API to be set to the path of the tests-dc-api executable"
exit 1
fi
$TESTS_DC_API export-openapi-spec | tail -n 1 | jq . > $SCHEMA_FILE
fi
echo "Deleting existing generated model..."
rm -rf "$TYPES_DIR/models"
rm -f "$TYPES_DIR/index.ts"
echo "Generating model from $SCHEMA_FILE..."
npx openapi --useUnionTypes --input "$SCHEMA_FILE" --output "$TYPES_DIR" --exportServices false --exportCore false --indent 2
cd "$TYPES_PROJECT_DIR"
if ! git diff package.json | grep "+ \"version\":" > /dev/null; then
echo "Bumping the minor version of dc-api-types..."
echo "NOTE: If you don't like the new number, change it in dc-api-types' package.json and then run 'make update-api-types-deps'"
npm version minor
../scripts/update-api-types-deps.sh
else
echo "Skipping dc-api-types version bump since it seems like it has already been changed"
fi


@ -0,0 +1,25 @@
#!/usr/bin/env bash
set -euo pipefail
PROJECT_ROOT="$( cd "$( dirname "${BASH_SOURCE[0]}" )/.." >/dev/null 2>&1 && pwd )" # ... https://stackoverflow.com/a/246128/176841
cd "$PROJECT_ROOT"
TYPES_PROJECT_DIR="./dc-api-types"
PROJECT_DIR_NAMES=( "reference" "sqlite" )
TYPES_VERSION=$( jq '.version' "$TYPES_PROJECT_DIR/package.json" )
echo "Updating projects dependent on API types to version $TYPES_VERSION..."
for project in "${PROJECT_DIR_NAMES[@]}"; do
PROJECT_DIR="./$project"
echo "Updating $project..."
TMP_FILE="$( mktemp )"
jq ".dependencies[\"@hasura/dc-api-types\"] = $TYPES_VERSION" "$PROJECT_DIR/package.json" > "$TMP_FILE"
mv -f "$TMP_FILE" "$PROJECT_DIR/package.json"
done
npm install
make derive-lockfiles
echo "Done updating API types version in dependant projects"


@ -1 +1 @@
lts/gallium
v16.17.0

File diff suppressed because it is too large


@ -1,7 +1,7 @@
{
"name": "dc-agent-reference",
"version": "1.0.0",
"description": "Reference implementation of a Data Connector Agent for Hasura GraphQL Engine",
"name": "@hasura/dc-agent-sqlite",
"version": "0.1.0",
"description": "SQLite Data Connector Agent for Hasura GraphQL Engine",
"author": "Hasura (https://github.com/hasura/graphql-engine)",
"license": "Apache-2.0",
"repository": {
@ -22,23 +22,23 @@
},
"dependencies": {
"@fastify/cors": "^8.1.0",
"fastify": "^4.4.0",
"@hasura/dc-api-types": "0.3.0",
"fastify-metrics": "^9.2.1",
"fastify": "^4.4.0",
"openapi3-ts": "^2.0.2",
"pino-pretty": "^8.1.0",
"sequelize": "^6.21.2",
"sqlite": "^4.1.1",
"sqlite-parser": "^1.0.1",
"sqlite": "^4.1.1",
"sqlite3": "^5.0.8",
"sqlstring-sqlite": "^0.1.1",
"xml2js": "^0.4.23",
"pino-pretty": "^8.1.0"
"sqlstring-sqlite": "^0.1.1"
},
"devDependencies": {
"@tsconfig/node16": "^1.0.2",
"@types/node": "^16.11.38",
"@tsconfig/node16": "^1.0.3",
"@types/node": "^16.11.49",
"@types/sqlite3": "^3.1.8",
"@types/xml2js": "^0.4.11",
"ts-node": "^10.8.1",
"ts-node": "^10.9.1",
"typescript": "^4.7.4"
}
}


@ -1,5 +1,5 @@
import { configSchema } from "./config"
import { CapabilitiesResponse } from "./types"
import { CapabilitiesResponse } from "@hasura/dc-api-types"
import { envToBool } from "./util"
export const capabilitiesResponse: CapabilitiesResponse = {
@ -17,4 +17,3 @@ export const capabilitiesResponse: CapabilitiesResponse = {
... ( envToBool('METRICS') ? { metrics: {} } : {} )
},
}


@ -1,5 +1,5 @@
import { FastifyRequest } from "fastify"
import { ConfigSchemaResponse } from "./types"
import { ConfigSchemaResponse } from "@hasura/dc-api-types"
export type Config = {
db: string,


@@ -1,10 +1,10 @@
 import Fastify from 'fastify';
 import FastifyCors from '@fastify/cors';
 import { getSchema } from './schema';
 import { explain, queryData } from './query';
 import { getConfig, tryGetConfig } from './config';
 import { capabilitiesResponse } from './capabilities';
-import { QueryResponse, SchemaResponse, QueryRequest, CapabilitiesResponse, ExplainResponse } from './types';
+import { QueryResponse, SchemaResponse, QueryRequest, CapabilitiesResponse, ExplainResponse } from '@hasura/dc-api-types';
 import { connect } from './db';
 import { envToBool, envToString } from './util';
 import metrics from 'fastify-metrics';
@@ -80,7 +80,7 @@ const queryHistogram = new prometheus.Histogram({
   name: 'query_durations',
   help: 'Histogram of the duration of query response times.',
   buckets: prometheus.exponentialBuckets(0.0001, 10, 8),
-  labelNames: ['route'],
+  labelNames: ['route'] as const,
 });
 const sqlLogger = (sql: string): void => {

View File

@@ -1,4 +1,4 @@
 import { Config } from "./config";
 import { connect, SqlLogger } from "./db";
 import { coerceUndefinedToNull, omap, last, coerceUndefinedOrNullToEmptyRecord, envToBool, isEmptyObject, tableNameEquals, unreachable, logDeep } from "./util";
 import {
@@ -19,7 +19,7 @@ import {
   OrderDirection,
   UnaryComparisonOperator,
   ExplainResponse,
-} from "./types";
+} from "@hasura/dc-api-types";
 const SqlString = require('sqlstring-sqlite');

View File

@@ -1,4 +1,4 @@
-import { SchemaResponse, ScalarType, ColumnInfo, TableInfo, Constraint } from "./types"
+import { SchemaResponse, ScalarType, ColumnInfo, TableInfo, Constraint } from "@hasura/dc-api-types"
 import { Config } from "./config";
 import { connect, SqlLogger } from './db';
 import { logDeep } from "./util";

File diff suppressed because it is too large

View File

@ -1,67 +0,0 @@
/* istanbul ignore file */
/* tslint:disable */
/* eslint-disable */
export type { Aggregate } from './models/Aggregate';
export type { AndExpression } from './models/AndExpression';
export type { AnotherColumnComparison } from './models/AnotherColumnComparison';
export type { ApplyBinaryArrayComparisonOperator } from './models/ApplyBinaryArrayComparisonOperator';
export type { ApplyBinaryComparisonOperator } from './models/ApplyBinaryComparisonOperator';
export type { ApplyUnaryComparisonOperator } from './models/ApplyUnaryComparisonOperator';
export type { BinaryArrayComparisonOperator } from './models/BinaryArrayComparisonOperator';
export type { BinaryComparisonOperator } from './models/BinaryComparisonOperator';
export type { BooleanOperators } from './models/BooleanOperators';
export type { Capabilities } from './models/Capabilities';
export type { CapabilitiesResponse } from './models/CapabilitiesResponse';
export type { ColumnCountAggregate } from './models/ColumnCountAggregate';
export type { ColumnField } from './models/ColumnField';
export type { ColumnFieldValue } from './models/ColumnFieldValue';
export type { ColumnInfo } from './models/ColumnInfo';
export type { ComparisonColumn } from './models/ComparisonColumn';
export type { ComparisonOperators } from './models/ComparisonOperators';
export type { ComparisonValue } from './models/ComparisonValue';
export type { ConfigSchemaResponse } from './models/ConfigSchemaResponse';
export type { Constraint } from './models/Constraint';
export type { ExplainCapabilities } from './models/ExplainCapabilities';
export type { ExplainResponse } from './models/ExplainResponse';
export type { Expression } from './models/Expression';
export type { Field } from './models/Field';
export type { FilteringCapabilities } from './models/FilteringCapabilities';
export type { MetricsCapabilities } from './models/MetricsCapabilities';
export type { MutationCapabilities } from './models/MutationCapabilities';
export type { NotExpression } from './models/NotExpression';
export type { NullColumnFieldValue } from './models/NullColumnFieldValue';
export type { OpenApiDiscriminator } from './models/OpenApiDiscriminator';
export type { OpenApiExternalDocumentation } from './models/OpenApiExternalDocumentation';
export type { OpenApiReference } from './models/OpenApiReference';
export type { OpenApiSchema } from './models/OpenApiSchema';
export type { OpenApiXml } from './models/OpenApiXml';
export type { OrderBy } from './models/OrderBy';
export type { OrderByColumn } from './models/OrderByColumn';
export type { OrderByElement } from './models/OrderByElement';
export type { OrderByRelation } from './models/OrderByRelation';
export type { OrderBySingleColumnAggregate } from './models/OrderBySingleColumnAggregate';
export type { OrderByStarCountAggregate } from './models/OrderByStarCountAggregate';
export type { OrderByTarget } from './models/OrderByTarget';
export type { OrderDirection } from './models/OrderDirection';
export type { OrExpression } from './models/OrExpression';
export type { Query } from './models/Query';
export type { QueryCapabilities } from './models/QueryCapabilities';
export type { QueryRequest } from './models/QueryRequest';
export type { QueryResponse } from './models/QueryResponse';
export type { Relationship } from './models/Relationship';
export type { RelationshipCapabilities } from './models/RelationshipCapabilities';
export type { RelationshipField } from './models/RelationshipField';
export type { RelationshipType } from './models/RelationshipType';
export type { ScalarType } from './models/ScalarType';
export type { ScalarValue } from './models/ScalarValue';
export type { ScalarValueComparison } from './models/ScalarValueComparison';
export type { SchemaResponse } from './models/SchemaResponse';
export type { SingleColumnAggregate } from './models/SingleColumnAggregate';
export type { SingleColumnAggregateFunction } from './models/SingleColumnAggregateFunction';
export type { StarCountAggregate } from './models/StarCountAggregate';
export type { SubscriptionCapabilities } from './models/SubscriptionCapabilities';
export type { TableInfo } from './models/TableInfo';
export type { TableName } from './models/TableName';
export type { TableRelationships } from './models/TableRelationships';
export type { UnaryComparisonOperator } from './models/UnaryComparisonOperator';

Some files were not shown because too many files have changed in this diff.