* Rename `global_hash` to `agg_tiles_hash`
This is still a big sticking point: what should the metadata key for this
value be named? The value is the hash of `z,x,y,tile` across all rows of the
`tiles` table (or view). Should the name include `md5`, or should the hash
algorithm be auto-detected from its length? (details in #856; a conceptual
sketch of the aggregate follows the list below)
* Generate it based on `tiles` table/view
* validate or generate, but not both (it will always fail otherwise)
* break up logic for per-tile, total, and integrity checks
* delete unused sqlx prep file
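For illustration, a conceptual SQL sketch of what the aggregated hash covers. This is not the exact statement the tool runs; it assumes a hypothetical `md5_concat` aggregate registered on the SQLite connection, since plain SQLite has no built-in hashing:
```sql
-- Conceptual sketch only: `md5_concat` is an assumed aggregate function,
-- and a deterministic row ordering is required for a reproducible hash.
SELECT md5_concat(
         cast(zoom_level  AS text) || ',' ||
         cast(tile_column AS text) || ',' ||
         cast(tile_row    AS text) || ',' ||
         hex(tile_data))
FROM tiles;
```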
Modify `/catalog` endpoint to return an object instead of a list. This
allows future expansion of the catalog schema, e.g. adding new types of
data.
The new schema:
```json
{
  "tiles": {
    "function_zxy_query": {
      "name": "public.function_zxy_query",
      "content_type": "application/x-protobuf"
    },
    "points1": {
      "name": "public.points1.geom",
      "content_type": "image/webp"
    },
    ...
  },
}
```
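For reference, the catalog of a running server can be inspected directly (Martin's default port 3000 assumed):
```bash
# Fetch the catalog; every key under "tiles" is a source ID that can be
# requested as /{source_id}/{z}/{x}/{y}
curl -s http://localhost:3000/catalog
```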
If a PostgreSQL function has an SQL comment, Martin will try to parse that
comment as JSON and use its values to override the auto-generated TileJSON.
Creating the comment with the form below is recommended, as it ensures the
value is valid JSON.
```sql
DO $do$ BEGIN
EXECUTE 'COMMENT ON FUNCTION YOUR_FUNCTION (ARG1_TYPE,ARG2_TYPE,..ARGN_TYPE) IS $tj$' || $$
{
"description": "description override",
...
}
$$::json || '$tj$';
END $do$;
```
Partially implements #822
* Add `MbtType::FlatWithHash`
* Support copying, diffing and applying diffs to and from any `MbtType`
* Support validating tile data if the hash is contained in the `*.mbtiles`
file, i.e. it is of `MbtType::FlatWithHash` or `MbtType::Normalized` (a rough
schema sketch follows this list)
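For context, a rough sketch of what a flat-with-hash layout looks like; table and column names here are indicative, not the exact DDL:
```sql
-- Indicative sketch of a flat-with-hash layout: the hash sits in the same
-- row as the tile, and a `tiles` view keeps the standard MBTiles interface.
CREATE TABLE tiles_with_hash (
    zoom_level  INTEGER NOT NULL,
    tile_column INTEGER NOT NULL,
    tile_row    INTEGER NOT NULL,
    tile_data   BLOB,
    tile_hash   TEXT);

CREATE VIEW tiles AS
  SELECT zoom_level, tile_column, tile_row, tile_data
  FROM tiles_with_hash;
```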
---------
Co-authored-by: rstanciu <rstanciu@rivian.com>
Co-authored-by: Yuri Astrakhan <yuriastrakhan@gmail.com>
Resolves #682
- [x] Get the `id_column` string from `config.yaml` and use it for the ID
column (see the config sketch after this list)
- [x] Support a list of strings
- [x] Add info/warning messages if the column is missing or of the wrong type
- [x] If the column used for the feature ID is found, remove it from
properties (see inline comment)
- [x] Clean up logging messages
- [x] Add more tests to catch other edge cases
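A rough sketch of how this configuration is meant to look (the nesting and surrounding keys are assumptions, not the exact documented schema):
```yaml
# Hypothetical snippet; only the id_column lines illustrate the new feature.
postgres:
  tables:
    my_table:
      schema: public
      table: my_table
      geometry_column: geom
      # a single column to use as the feature ID...
      id_column: gid
      # ...or a list of candidate columns:
      # id_column: [gid, osm_id]
```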
---------
Co-authored-by: Yuri Astrakhan <YuriAstrakhan@gmail.com>
* on `--save-config`, only save configured `auto_publish` settings
* alias `from_schemas` as `from_schema` (see the config sketch below)
* add integration testing for `auto_publish`
* if integration test DB preloading fails, try to clean up the test DB
* A few more info traces
This change should make it easier to test #790. cc: @Binabh
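A hedged sketch of the kind of configuration these changes touch (the key nesting is an assumption):
```yaml
# Hypothetical snippet; key nesting is an assumption, not the documented schema.
postgres:
  auto_publish:
    tables:
      # `from_schema` is accepted as an alias of `from_schemas`
      from_schemas:
        - public
        - tiles
```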
* Add the ability to generate a diff file by passing `--diff-with-file` to
the `copy` tool (sketched below)
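A hypothetical invocation (binary name, argument order, and exact semantics are assumptions, not the documented CLI):
```bash
# Assumed usage: write to diff.mbtiles only the tiles of new.mbtiles that
# differ from old.mbtiles, so the diff can later be applied on top of old.mbtiles.
mbtiles copy new.mbtiles diff.mbtiles --diff-with-file old.mbtiles
```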
---------
Co-authored-by: Yuri Astrakhan <YuriAstrakhan@gmail.com>
* [BREAKING] Use source ID (table name) as the default layer ID, instead
of `schema.table.column`
* Add support for the optional `layer_id` table config parameter (sketched below)
Fixes #595
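A minimal config sketch of the new parameter (its placement under a table source is an assumption):
```yaml
# Hypothetical snippet; only layer_id is the new parameter.
postgres:
  tables:
    my_table:
      # ...other table settings...
      layer_id: my_custom_layer
```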
* make tilejson's `name` be the same as the ID of the source (even if
aliased)
* `/catalog` will always show ID, but now it will hide the `name` if it
is the same as the `id`
* make `description` be the longer version, e.g. `public.table.column`
format - not guaranteed to be stable
* make `vector_layers` have the fields auto-discovered in the PG table
* preserve the order of the serialized json fields
Fixes #583
The compression middleware turned out to be hard to use for image tiles: it
simply looks at the content-encoding and, if not set, tries to compress
whenever the client accepts it.
Instead, individual routes are now configured either with that middleware or,
for tiles, with explicit decompression and optional recompression where
applicable.
Encoding is now tracked separately from the tile content, which is cleaner
too. Plus lots of tests for mbtiles & pmtiles.
Fixes #577
* Adds a view to `points1.sql` fixture
* Replaces `table` with `view` in log statements relating to views
---------
Co-authored-by: Chris Thiange <cthiange@gmail.com>
* clean up reporting of unused config params - instead of printing them as
they are found, collect them and print them in one place if needed (allows
testing too)
* remove `vector_layer` from the catalog - too verbose, not needed - it can
be retrieved via the TileJSON of an individual source
* clean up tests so that they all use the same config yaml
Adds a new [.mbtiles](https://github.com/mapbox/mbtiles-spec/blob/master/1.3/spec.md)
backend, without grid support. Uses extensive tile content detection, i.e.
whether the content is gzipped, PNG, JPEG, GIF, or WebP.
From the CLI, it can be as easy as adding a path to a directory that contains
an `.mbtiles` file (works just like the pmtiles support):
```bash
# All *.mbtiles files in this dir will be published.
# The filename will be used as the source ID
martin ./tests/fixtures
```
From the configuration file, the path can be specified in a number of ways
(same as pmtiles):
```yaml
mbtiles:
  paths:
    # scan this whole dir, matching all *.mbtiles files
    - /dir-path
    # specific mbtiles file will be published as the mbtiles2 source
    - /path/to/mbtiles2.mbtiles
  sources:
    # named source matching a source name to a single file
    mb-src1: /tmp/mbtiles.mbtiles
    # named source where the path is set explicitly; this leaves room to add more options later
    mb-src2:
      path: /tmp/mbtiles.mbtiles
```
Fixes #494
Merge after #548
Adds a new [.pmtiles](https://protomaps.com/docs/pmtiles/) backend.
Supports all tile formats, e.g. PNG, vector (MVT), etc.
From the CLI, it can be as easy as adding a path to a directory that contains
a `.pmtiles` file:
```bash
# All *.pmtiles files in this dir will be published.
# The filename will be used as the source ID
martin ./tests/fixtures
```
From the configuration file, the path can be specified in a number of ways:
```yaml
pmtiles:
  paths:
    # scan this whole dir, matching all *.pmtiles files
    - /dir-path
    # specific pmtiles file will be published as the pmtiles2 source
    - /path/to/pmtiles2.pmtiles
  sources:
    # named source matching a source name to a single file
    pm-src1: /tmp/pmtiles.pmtiles
    # named source where the path is set explicitly; this leaves room to add more options later
    pm-src2:
      path: /tmp/pmtiles.pmtiles
```
Fixes #508
* introduce a new `Connections` object to track all positional strings
passed as CLI arguments
* each tile provider can now indicate if it can take a positional CLI arg,
and if the value can be shared between multiple providers, i.e. if it's a
directory that could contain files for multiple providers
* make xyz use better types - u8 for zoom, u32 for x&y; Postgres casts those
to INT2 and INT8 (see the small sketch after this list)
* fix a minor bug in the pre-push git hook so it aborts on a testing error
* added GIF detection/type
* combine MVT and compression concepts into one enum more explicitly. It is
not ideal (technically they are separate concerns), but it keeps things a bit
simpler for now across multiple providers.
* set content encoding and content type on HTTP responses if known, and
also include them in the `/catalog` response (json)
* raise an error if the user attempts to merge non-concatenable tiles from
multiple sources. We may want to implement this in the future, e.g. combining
multiple semi-transparent PNGs, or even mixing GIF, PNG & JPEG
* do not set content-type on empty responses (http 204)
* add tilejson outputs to testing
* fixed SQL to work on older PG versions
* re-enable CI to compare the `test.sh` output against the expected output
stored in `tests/expected`
* add Postgres-in-Docker tests on Linux - one for the oldest supported DB
version, and another using a more recent one
* minor justfile cleanup
* ensure config files are sorted alphabetically
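For illustration, a tiny sketch of the coordinate types mentioned above (the struct name and layout are hypothetical, not the actual definition in the codebase):
```rust
// Illustrative only: mirrors the types described above; the struct name is hypothetical.
pub struct TileCoord {
    pub z: u8,  // zoom level, passed to Postgres as INT2
    pub x: u32, // tile column, passed to Postgres as INT8
    pub y: u32, // tile row, passed to Postgres as INT8
}
```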
Rework CI to run tests locally using the VM-installed Postgres on all
target platforms.
### CI jobs
* Build release versions on Linux/Win/Mac and save build results as
output artifacts
* In separate VMs (Linux/Win/Mac)
* use
[nyurik/action-setup-postgis](https://github.com/nyurik/action-setup-postgis)
to install postgis and run tests using the built artifacts
* run `cargo test` on Linux only
* copy built artifacts from the build step, and run tests using the
release martin binary
* package and publish if this is a release
### Other changes
* Port some minor changes from the rewrite to make porting easier
* minor cleanups
* remove all "expected" data files - too unstable to be usable
* Add justfile to simplify running all the tests
* Save all PBF outputs to the text files
* Consolidate all tests to reuse the same code
* Consolidate database initialization
* updated readme with the new instructions
Note that while this PR creates "expected" files, the CI cannot validate
the generated results because the output is not stable. Eventually we
may try to output just the non-geometry values to have reasonable tests
comparing against the expected results.