This teaches `hge_server` how to run more tests, thanks to `hge_env`.
It also simplifies the logic a bit more.
I have also modified _run.sh_ and _docker-compose.yml_ so we can run multiple test suites, one after another.
PR-URL: https://github.com/hasura/graphql-engine-mono/pull/6105
GitOrigin-RevId: eff009362eb6bb90c07cedaf96dfe6ec9336ff32
If we don't do this, we might end up applying metadata with a stale schema cache.
Following the principle of least surprise, replacing the metadata should probably compute inconsistencies with regard to the actual state of the database.
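To illustrate the expected behaviour, here is a minimal sketch (not code from this change) that replaces the metadata and then asks the server what it considers inconsistent; the endpoint and request types are the public metadata API, but the environment variable names and the example metadata are assumptions:

```python
# Minimal sketch: replace the metadata, then report whatever the server now
# considers inconsistent with the actual state of the database.
# Assumes a local HGE instance; the env var names here are placeholders.
import os
import requests

HGE_URL = os.environ.get("HGE_URL", "http://localhost:8080")
HGE_KEY = os.environ.get("HGE_KEY", "")


def metadata_api(payload):
    response = requests.post(
        f"{HGE_URL}/v1/metadata",
        json=payload,
        headers={"x-hasura-admin-secret": HGE_KEY},
    )
    response.raise_for_status()
    return response.json()


# A deliberately tiny (empty) metadata document, just for illustration.
new_metadata = {"version": 3, "sources": []}

metadata_api({"type": "replace_metadata", "args": new_metadata})
print(metadata_api({"type": "get_inconsistent_metadata", "args": {}}))
```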
PR-URL: https://github.com/hasura/graphql-engine-mono/pull/6026
GitOrigin-RevId: ff7469d7d9857c8a9f517d5d0b6f1ecf463621b3
This has two purposes:
* When running the Python integration tests against a running HGE instance, with `--hge-url`, it will check which environment variables are available and actively skip the test if the required ones aren't set. This replaces the previous ad-hoc skip behavior.
* More interestingly, when running against a binary with `--hge-bin`, the environment variables are passed through, which means different tests can run with different environment variables.
On top of this, the various services we use for testing now also provide their own environment variables, rather than expecting a test script to do it.
In order to make this work, I also had to invert the dependency between various services and `hge_ctx`. I extracted a `pg_version` fixture to provide the PostgreSQL version, and now pass the `hge_url` and `hge_key` explicitly to `ActionsWebhookServer`.
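As a rough illustration of the skip behaviour, a fixture along these lines skips any test that needs an environment variable which isn't set (a minimal sketch only; the fixture and variable names below are made up, not the actual test-suite code):

```python
import os

import pytest


@pytest.fixture
def actions_webhook_env():  # illustrative fixture name, not the real one
    """Yield the environment variables this test needs, skipping otherwise."""
    required = ["GRAPHQL_SERVICE_HANDLER"]  # illustrative variable name
    missing = [name for name in required if name not in os.environ]
    if missing:
        pytest.skip(f"Missing environment variables: {', '.join(missing)}")
    return {name: os.environ[name] for name in required}
```

With `--hge-url`, a missing variable results in a skip; with `--hge-bin`, the same variables are simply passed through to the spawned binary.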
PR-URL: https://github.com/hasura/graphql-engine-mono/pull/6028
GitOrigin-RevId: 16d866741dba5887da1adf4e1ade8182ccc9d344
NPM v7 uses a new (backwards-compatible) lockfile format. This upgrades all our various _package-lock.json_ files to use the new format.
The new format is much more verbose, which allows NPM to be a lot faster.
I figured it was cleaner to do this once in a separate PR rather than as part of adding or upgrading a dependency.
PR-URL: https://github.com/hasura/graphql-engine-mono/pull/5869
GitOrigin-RevId: 322fb63b96e2d873a4a3cc05fa6c7afa414716ce
This adds support for running the Python integration tests for MSSQL and Citus just as in CI, as follows:
```
./server/tests-py/run.sh backend-mssql
./server/tests-py/run.sh backend-citus
```
These run the named CI jobs, providing the appropriate backend.
(In reality, all backends are always provided, which is much simpler.)
It also provides the various databases to _server/tests-py/run-new.sh_, though the tests fail as they don't properly initialize the sources. (This will be fixed in the future by provisioning sources in the test framework itself.)
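For reference, provisioning a Postgres source from within the test framework would amount to a metadata API call along these lines (a sketch only, assuming the standard `pg_add_source` metadata request; the function name and arguments are placeholders):

```python
import requests


def add_postgres_source(hge_url, hge_key, source_name, database_url):
    """Register a Postgres database as a source on a running HGE instance."""
    response = requests.post(
        f"{hge_url}/v1/metadata",
        json={
            "type": "pg_add_source",
            "args": {
                "name": source_name,
                "configuration": {
                    "connection_info": {"database_url": database_url},
                },
            },
        },
        headers={"x-hasura-admin-secret": hge_key},
    )
    response.raise_for_status()
    return response.json()
```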
PR-URL: https://github.com/hasura/graphql-engine-mono/pull/5997
GitOrigin-RevId: c276a4779a35bb538ef0dc02ac8b7cb2d5a8dec5
This makes a few changes to the test scripts and makefiles in order to make things simpler for the average Apple user.
First of all, we change the `wait_for_mysql` function to use "localhost", not "127.0.0.1", which fixes an issue on my system when attempting to connect to the MySQL server.
Secondly, we split the SQL Server test image into two:
* The first is the server itself, which now automatically uses `azure-sql-edge` as the image if you are on an aarch64 chip and using the `make` commands.
* The second is the initialization script. Because `sqlcmd` is not available in the `azure-sql-edge` image on aarch64, we use a separate container based on `mssql-tools` to initialize the server.
The README has been updated.
Tested on both macOS/aarch64 (with other changes) and Linux/x86_64.
PR-URL: https://github.com/hasura/graphql-engine-mono/pull/5986
GitOrigin-RevId: b16e079861dcbcc66773295c47d715e443b67eea
See: https://github.com/grafana/k6/issues/2685
It might be interesting to take decompression time into consideration when thinking about performance, but in general I think doing so is surprising: I wasted a lot of time trying to figure out why my optimizations to the compression code path weren't improving things to the degree I expected.
The downside here is that we lose error reporting, so you'll need to only set `discardResponseBodies: true` after the query has been tested.
PR-URL: https://github.com/hasura/graphql-engine-mono/pull/5940
GitOrigin-RevId: 82a589a59b93f10ffb5391e4a3190459fb6e613b
Result of executing the following commands:
```shell
# replace "as Q" imports with "as PG" (in retrospect this didn't need a regex)
git grep -lE 'as Q($|[^a-zA-Z])' -- '*.hs' | xargs sed -i -E 's/as Q($|[^a-zA-Z])/as PG\1/'
# replace " Q." with " PG."
git grep -lE ' Q\.' -- '*.hs' | xargs sed -i 's/ Q\./ PG./g'
# replace "(Q." with "(PG."
git grep -lE '\(Q\.' -- '*.hs' | xargs sed -i 's/(Q\./(PG./g'
# ditto, but for [, |, { and !
git grep -lE '\[Q\.' -- '*.hs' | xargs sed -i 's/\[Q\./\[PG./g'
git grep -l '|Q\.' -- '*.hs' | xargs sed -i 's/|Q\./|PG./g'
git grep -l '{Q\.' -- '*.hs' | xargs sed -i 's/{Q\./{PG./g'
git grep -l '!Q\.' -- '*.hs' | xargs sed -i 's/!Q\./!PG./g'
```
(Doing the `grep -l` before the `sed`, instead of `sed` on the entire codebase, reduces the number of `mtime` updates, and so reduces how many times a file gets recompiled while checking intermediate results.)
Finally, I manually removed a broken and unused `Arbitrary` instance in `Hasura.RQL.Network`. (It used an `import Test.QuickCheck.Arbitrary as Q` statement, which was erroneously caught by the first find-replace command.)
After this PR, `Q` is no longer used as an import qualifier. That was not the goal of this PR, but perhaps it's a useful fact for future efforts.
PR-URL: https://github.com/hasura/graphql-engine-mono/pull/5933
GitOrigin-RevId: 8c84c59d57789111d40f5d3322c5a885dcfbf40e
This fixes a few issues so that we can run `./server/tests-py/run.sh backend-bigquery` to run the Python integration tests for BigQuery locally.
* We forward the relevant environment variables to the Docker container.
* We increase the HTTP timeout, as I'm seeing requests taking up to 90s locally.
* We rewrite the setup so that it avoids `INSERT INTO`, which is not available on the BigQuery free tier. Instead, we use `CREATE TABLE ... AS SELECT ...`. This is the same method used by the Haskell integration tests.
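For illustration, the new setup style boils down to statements like the one below (a sketch, not the actual setup code; the dataset and table names are placeholders, and credentials are assumed to come from the usual service-account environment):

```python
# Build test data with CREATE TABLE ... AS SELECT instead of INSERT INTO,
# which is not available on the BigQuery free tier.
from google.cloud import bigquery

client = bigquery.Client()  # uses the configured service-account credentials
client.query(
    """
    CREATE TABLE hasura_test.author AS
    SELECT 1 AS id, 'Author 1' AS name
    UNION ALL
    SELECT 2 AS id, 'Author 2' AS name
    """
).result()
```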
We also capture local server output in a volume so it's easier to figure out what went wrong later.
PR-URL: https://github.com/hasura/graphql-engine-mono/pull/5921
GitOrigin-RevId: c628f8c08a84f2582958659ab6d6494832471f6f