Improve Coverage tooling (#535)

* added manual coverage justfile command
* a lot of small refactorings of config and argument parsing
* feature: support jsonb query param for functions
* cleaned up public/private access
* make all tests populate tables with predefined values to avoid issues with random data
Yuri Astrakhan 2022-12-27 01:56:27 -05:00, committed by GitHub
parent 3a713a0269, commit 555a1fccdd
48 changed files with 1425 additions and 836 deletions


@@ -13,6 +13,7 @@ justfile
 **/*.rs.bk
 .idea/
 test_log*
+*.profraw
 pg_data/
 config.yml

.gitignore (vendored)

@@ -5,6 +5,7 @@
 **/*.rs.bk
 .idea/
 test_log*
+*.profraw
 pg_data/
 config.yml


@@ -1,4 +1,5 @@
 [workspace]
+default-members = ["martin-tile-utils", "."]
 members = ["martin-tile-utils"]

 [package]


@@ -318,7 +318,7 @@ curl localhost:3000/points,lines/0/0/0
 ## Function Sources

-Function Source is a database function which can be used to query [vector tiles](https://github.com/mapbox/vector-tile-spec). When started, martin will look for the functions with a suitable signature. A function that takes `z integer` (or `zoom integer`), `x integer`, `y integer`, and an optional `query json` and returns `bytea`, can be used as a Function Source. Alternatively the function could return a record with a single `bytea` field, or a record with two fields of types `bytea` and `text`, where the `text` field is a etag key (i.e. md5 hash).
+Function Source is a database function which can be used to query [vector tiles](https://github.com/mapbox/vector-tile-spec). When started, martin will look for the functions with a suitable signature. A function that takes `z integer` (or `zoom integer`), `x integer`, `y integer`, and an optional `query json` and returns `bytea`, can be used as a Function Source. Alternatively the function could return a record with a single `bytea` field, or a record with two fields of types `bytea` and `text`, where the `text` field is an etag key (i.e. md5 hash).

 | Argument | Type | Description |
 |----------------------------|---------|-------------------------|
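For illustration, a function source with the expected signature could be sketched as follows; `my_table`, its `geom` column, and the SRIDs are placeholders, not part of this commit:

```sql
-- Hypothetical function source; my_table and its geom column are placeholders
CREATE OR REPLACE FUNCTION function_zxy_query(z integer, x integer, y integer, query jsonb)
RETURNS bytea AS $$
DECLARE
  mvt bytea;
BEGIN
  SELECT INTO mvt ST_AsMVT(tile, 'function_zxy_query', 4096, 'geom') FROM (
    SELECT ST_AsMVTGeom(
             ST_Transform(geom, 3857),
             ST_TileEnvelope(z, x, y), 4096, 64, true) AS geom
    FROM my_table
    WHERE geom && ST_Transform(ST_TileEnvelope(z, x, y), 4326)
  ) AS tile WHERE geom IS NOT NULL;
  RETURN mvt;
END
$$ LANGUAGE plpgsql IMMUTABLE STRICT PARALLEL SAFE;
```

Note the `query jsonb` parameter: this commit extends function discovery to accept `jsonb` in addition to `json` for the optional fourth argument.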
@@ -448,18 +448,18 @@ Options:
 ## Environment Variables

-You can also configure martin using environment variables
+You can also configure martin using environment variables, but only if the configuration file is not used. See [configuration section](#configuration-file) on how to use environment variables with config files.

 | Environment variable          | Example                              | Description                                  |
 |-------------------------------|--------------------------------------|----------------------------------------------|
 | `DATABASE_URL`                | `postgresql://postgres@localhost/db` | Postgres database connection                 |
 | `CA_ROOT_FILE`                | `./ca-certificate.crt`               | Loads trusted root certificates from a file  |
 | `DEFAULT_SRID`                | `4326`                               | Fallback SRID                                |
-| `DANGER_ACCEPT_INVALID_CERTS` | `false`                              | Trust invalid certificates                   |
+| `DANGER_ACCEPT_INVALID_CERTS` | `0`                                  | Trust invalid certificates (any value)       |

 ## Configuration File

-If you don't want to expose all of your tables and functions, you can list your sources in a configuration file. To start martin with a configuration file you need to pass a path to a file with a `--config` argument.
+If you don't want to expose all of your tables and functions, you can list your sources in a configuration file. To start martin with a configuration file you need to pass a path to a file with a `--config` argument. Config files may contain environment variables, which will be expanded before parsing. For example, to use `MY_DATABASE_URL` in your config file: `connection_string: ${MY_DATABASE_URL}`, or with a default `connection_string: ${MY_DATABASE_URL:-postgresql://postgres@localhost/db}`

 ```shell
 martin --config config.yaml
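A minimal config sketch using such an expansion; the variable name is arbitrary:

```yaml
# config.yaml -- ${...} references are expanded from the environment before parsing
connection_string: ${MY_DATABASE_URL:-postgresql://postgres@localhost/db}
```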
@@ -786,7 +786,7 @@ Available recipes:
     psql *ARGS             # Run PSQL utility against the test database
     clean                  # Perform cargo clean to delete all build files
     clean-test             # Delete test output files
-    start-db               # Start a test database
+    start                  # Start a test database
     start-legacy           # Start a legacy test database
     docker-up name         # Start a specific test database, e.g. db or db-legacy
     stop                   # Stop the test database
@@ -796,9 +796,11 @@ Available recipes:
     test-int               # Run integration tests
     test-int-legacy        # Run integration tests using legacy database
     test-integration name  # Run integration tests with the given docker compose target
+    coverage FORMAT='html' # Run code coverage on tests and save its output in the coverage directory. Parameter could be html or lcov.
     docker-build           # Build martin docker image
     docker-run *ARGS       # Build and run martin docker image
     git *ARGS              # Do any git command, ensuring that the testing environment is set up. Accepts the same arguments as git.
+    lint                   # Run cargo fmt and cargo clippy
     git-pre-push           # These steps automatically run before git push via a git hook
 ```
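The new `coverage` recipe takes the output format as a parameter, for example:

```shell
just coverage        # html report, opened in the browser when done
just coverage lcov   # lcov output, e.g. for CI tools
```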


@@ -10,16 +10,16 @@ export CARGO_TERM_COLOR := "always"
     just --list --unsorted

 # Start Martin server and a test database
-run *ARGS: start-db
+run *ARGS: start
     cargo run -- {{ARGS}}

 # Start Martin server and open a test page
-debug-page *ARGS: start-db
+debug-page *ARGS: start
     open tests/debug.html  # run will not exit, so open debug page first
     just run {{ARGS}}

 # Run PSQL utility against the test database
-psql *ARGS: start-db
+psql *ARGS: start
     psql {{ARGS}} {{DATABASE_URL}}

 # Perform cargo clean to delete all build files
@@ -31,7 +31,7 @@ clean-test:
     rm -rf tests/output

 # Start a test database
-start-db: (docker-up "db")
+start: (docker-up "db")

 # Start a legacy test database
 start-legacy: (docker-up "db-legacy")
@@ -48,14 +48,14 @@ stop:
     docker-compose down

 # Run benchmark tests
-bench: start-db
+bench: start
     cargo bench

 # Run all tests using a test database
 test: test-unit test-int

 # Run Rust unit and doc tests (cargo test)
-test-unit *ARGS: start-db
+test-unit *ARGS: start
     cargo test --all-targets {{ARGS}}
     cargo test --all-targets --all-features {{ARGS}}
     cargo test --doc
@@ -71,7 +71,6 @@ test-int-legacy: (test-integration "db-legacy")
     #!/usr/bin/env sh
     export MARTIN_PORT=3111
     tests/test.sh
-    # echo "** Skipping comparison with the expected values - not yet stable"
     #if ( ! diff --brief --recursive --new-file tests/output tests/expected ); then
     #  echo "** Expected output does not match actual output"
     #  echo "** If this is expected, run 'just bless' to update expected output"
@@ -79,11 +78,62 @@ test-int-legacy: (test-integration "db-legacy")
     #fi

 ## Run integration tests and save its output as the new expected output
-# bless: start-db clean-test
+#bless: start clean-test
 #    tests/test.sh
 #    rm -rf tests/expected
 #    mv tests/output tests/expected
+
+# Run code coverage on tests and save its output in the coverage directory. Parameter could be html or lcov.
+coverage FORMAT='html':
+    #!/usr/bin/env bash
+    set -euo pipefail
+    if ! command -v grcov &> /dev/null; then \
+        echo "grcov could not be found. Installing..." ;\
+        cargo install grcov ;\
+    fi
+    if ! rustup component list | grep llvm-tools-preview &> /dev/null; then \
+        echo "llvm-tools-preview could not be found. Installing..." ;\
+        rustup component add llvm-tools-preview ;\
+    fi
+
+    just clean
+    just start
+
+    PROF_DIR=target/prof
+    mkdir -p "$PROF_DIR"
+    PROF_DIR=$(realpath "$PROF_DIR")
+
+    OUTPUT_RESULTS_DIR=target/coverage/{{FORMAT}}
+    mkdir -p "$OUTPUT_RESULTS_DIR"
+
+    export CARGO_INCREMENTAL=0
+    export RUSTFLAGS=-Cinstrument-coverage
+    # Avoid problems with relative paths
+    export LLVM_PROFILE_FILE=$PROF_DIR/cargo-test-%p-%m.profraw
+    export MARTIN_PORT=3111
+
+    cargo test --all-targets
+    cargo test --all-targets --all-features
+    tests/test.sh
+
+    set -x
+    grcov --binary-path ./target/debug \
+        -s . \
+        -t {{FORMAT}} \
+        --branch \
+        --ignore 'benches/*' \
+        --ignore 'tests/*' \
+        --ignore-not-existing \
+        -o target/coverage/{{FORMAT}} \
+        --llvm \
+        "$PROF_DIR"
+    { set +x; } 2>/dev/null
+
+    # if this is html, open it in the browser
+    if [ "{{FORMAT}}" = "html" ]; then
+        open "$OUTPUT_RESULTS_DIR/index.html"
+    fi

 # Build martin docker image
 docker-build:
     docker build -t martin .
@@ -94,7 +144,7 @@ docker-run *ARGS:

 # Do any git command, ensuring that the testing environment is set up. Accepts the same arguments as git.
 [no-exit-message]
-git *ARGS: start-db
+git *ARGS: start
     git {{ARGS}}

 # Run cargo fmt and cargo clippy
@@ -103,7 +153,7 @@ lint:
     cargo clippy --all-targets --all-features -- -D warnings -W clippy::pedantic

 # These steps automatically run before git push via a git hook
-git-pre-push: stop start-db
+git-pre-push: stop start
     rustc --version
     cargo --version
     just lint


@@ -70,9 +70,10 @@ impl DataFormat {
 #[cfg(test)]
 mod tests {
-    use super::*;
     use std::fs::read;
+
+    use super::*;

     #[test]
     fn test_data_format_png() {
         assert_eq!(


@@ -1,43 +1,58 @@
-use log::warn;
+use std::cell::RefCell;
+use std::collections::HashSet;
+use std::env::var_os;
 use std::ffi::OsString;
+
+use log::warn;
+use subst::VariableMap;

 /// A simple wrapper for the environment var access,
 /// so we can mock it in tests.
 pub trait Env {
     fn var_os(&self, key: &str) -> Option<OsString>;

     #[must_use]
-    fn get_env_str(&self, name: &str) -> Option<String> {
-        match self.var_os(name) {
-            Some(s) => match s.into_string() {
+    fn get_env_str(&self, key: &str) -> Option<String> {
+        match self.var_os(key) {
+            Some(s) => {
+                match s.into_string() {
                     Ok(v) => Some(v),
                     Err(v) => {
                         let v = v.to_string_lossy();
-                        warn!("Environment variable {name} has invalid unicode. Lossy representation: {v}");
+                        warn!("Environment variable {key} has invalid unicode. Lossy representation: {v}");
                         None
                     }
-                },
+                }
+            }
             None => None,
         }
     }
+
+    /// Return true if the environment variable exists, and it was not used by the substitution process.
+    #[must_use]
+    fn has_unused_var(&self, key: &str) -> bool;
 }

+/// A map that gives strings from the environment,
+/// but also keeps track of which variables were requested via the `VariableMap` trait.
 #[derive(Default)]
-pub struct SystemEnv;
+pub struct OsEnv(RefCell<HashSet<String>>);

-impl Env for SystemEnv {
+impl Env for OsEnv {
     fn var_os(&self, key: &str) -> Option<OsString> {
         std::env::var_os(key)
     }
+
+    fn has_unused_var(&self, key: &str) -> bool {
+        !self.0.borrow().contains(key) && var_os(key).is_some()
+    }
 }

-#[cfg(test)]
-#[derive(Default)]
-pub struct FauxEnv(std::collections::HashMap<&'static str, &'static str>);
-
-#[cfg(test)]
-impl Env for FauxEnv {
-    fn var_os(&self, key: &str) -> Option<OsString> {
-        self.0.get(key).map(Into::into)
-    }
-}
+impl<'a> VariableMap<'a> for OsEnv {
+    type Value = String;
+
+    fn get(&'a self, key: &str) -> Option<Self::Value> {
+        self.0.borrow_mut().insert(key.to_string());
+        std::env::var(key).ok()
+    }
+}
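A sketch of how the reworked trait could be exercised; the `demo` function and the connection string are illustrative, while `subst::substitute` is the generic expansion entry point of the `subst` crate used by the config loader:

```rust
use martin::args::{Env, OsEnv};

fn demo() -> Result<(), subst::Error> {
    let env = OsEnv::default();
    // ${PGUSER:-postgres} is resolved through OsEnv::get, which records the lookup
    let url = subst::substitute("postgresql://${PGUSER:-postgres}@localhost/db", &env)?;
    println!("{url}");
    // PGUSER was consumed by the substitution, so it is not reported as unused
    assert!(!env.has_unused_var("PGUSER"));
    Ok(())
}
```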


@@ -1,141 +1,7 @@
-use crate::args::environment::{Env, SystemEnv};
-use crate::args::pg::{parse_pg_args, PgArgs};
-use crate::args::srv::SrvArgs;
-use crate::config::Config;
-use crate::srv::config::SrvConfig;
-use crate::{Error, Result};
-use clap::Parser;
-use log::warn;
-use std::env;
-use std::path::PathBuf;
-
-pub mod environment;
-pub mod pg;
-pub mod srv;
-
-#[derive(Parser, Debug, PartialEq, Default)]
-#[command(about, version)]
-pub struct Args {
-    #[command(flatten)]
-    pub meta: MetaArgs,
-    #[command(flatten)]
-    pub srv: SrvArgs,
-    #[command(flatten)]
-    pub pg: Option<PgArgs>,
-}
-
-// None of these params will be transferred to the config
-#[derive(Parser, Debug, Clone, PartialEq, Default)]
-#[command(about, version)]
-pub struct MetaArgs {
-    // config may need a conflicts_with = "SourcesArgs"
-    // see https://github.com/clap-rs/clap/discussions/4562
-    /// Path to config file. If set, no tile source-related parameters are allowed.
-    #[arg(short, long)]
-    pub config: Option<PathBuf>,
-    /// Save resulting config to a file or use "-" to print to stdout.
-    /// By default, only print if sources are auto-detected.
-    #[arg(long)]
-    pub save_config: Option<PathBuf>,
-    /// [Deprecated] Scan for new sources on sources list requests
-    #[arg(short, long, hide = true)]
-    pub watch: bool,
-    /// Database connection strings
-    pub connection: Vec<String>,
-}
-
-impl TryFrom<Args> for Config {
-    type Error = Error;
-
-    fn try_from(args: Args) -> Result<Self> {
-        parse_args(&SystemEnv::default(), args)
-    }
-}
-
-fn parse_args(env: &impl Env, args: Args) -> Result<Config> {
-    if args.meta.watch {
-        warn!("The --watch flag is no longer supported, and will be ignored");
-    }
-    if env::var_os("WATCH_MODE").is_some() {
-        warn!("The WATCH_MODE environment variable is no longer supported, and will be ignored");
-    }
-    if args.meta.config.is_some() {
-        if args.pg.is_some() || !args.meta.connection.is_empty() {
-            return Err(Error::ConfigAndConnectionsError);
-        }
-        return Ok(Config {
-            srv: SrvConfig::from(args.srv),
-            ..Default::default()
-        });
-    }
-    let pg = args.pg.unwrap_or_default();
-    Ok(Config {
-        srv: SrvConfig::from(args.srv),
-        postgres: parse_pg_args(env, &pg, &args.meta.connection),
-        ..Default::default()
-    })
-}
-
-#[cfg(test)]
-mod tests {
-    use super::*;
-    use crate::args::environment::FauxEnv;
-
-    fn parse(args: &[&str]) -> Result<(Config, MetaArgs)> {
-        let args = Args::parse_from(args);
-        let meta = args.meta.clone();
-        parse_args(&FauxEnv::default(), args).map(|v| (v, meta))
-    }
-
-    #[test]
-    fn cli_no_args() {
-        let args = parse(&["martin"]).unwrap();
-        let expected = (Config::default(), MetaArgs::default());
-        assert_eq!(args, expected);
-    }
-
-    #[test]
-    fn cli_with_config() {
-        let args = parse(&["martin", "--config", "c.toml"]).unwrap();
-        let meta = MetaArgs {
-            config: Some(PathBuf::from("c.toml")),
-            ..Default::default()
-        };
-        assert_eq!(args, (Config::default(), meta));
-
-        let args = parse(&["martin", "--config", "c.toml", "--save-config", "s.toml"]).unwrap();
-        let meta = MetaArgs {
-            config: Some(PathBuf::from("c.toml")),
-            save_config: Some(PathBuf::from("s.toml")),
-            ..Default::default()
-        };
-        assert_eq!(args, (Config::default(), meta));
-
-        let args = parse(&["martin", "connection"]).unwrap();
-        let meta = MetaArgs {
-            connection: vec!["connection".to_string()],
-            ..Default::default()
-        };
-        assert_eq!(args, (Config::default(), meta));
-    }
-
-    #[test]
-    fn cli_bad_arguments() {
-        for params in [
-            ["martin", "--config", "c.toml", "--tmp"].as_slice(),
-            ["martin", "--config", "c.toml", "-c", "t.toml"].as_slice(),
-        ] {
-            let res = Args::try_parse_from(params);
-            assert!(res.is_err(), "Expected error, got: {res:?} for {params:?}");
-        }
-    }
-
-    #[test]
-    fn cli_bad_parsed_arguments() {
-        let args = Args::parse_from(["martin", "--config", "c.toml", "connection"]);
-        let err = parse_args(&FauxEnv::default(), args).unwrap_err();
-        assert!(matches!(err, Error::ConfigAndConnectionsError));
-    }
-}
+mod environment;
+mod pg;
+mod root;
+mod srv;
+
+pub use environment::{Env, OsEnv};
+pub use root::Args;


@@ -1,11 +1,9 @@
+use log::{info, warn};
+
 use crate::args::environment::Env;
-use crate::one_or_many::OneOrMany;
-use crate::pg::config;
-use crate::pg::config::PgConfig;
-use crate::pg::pool::POOL_SIZE_DEFAULT;
-use itertools::Itertools;
-use log::warn;
-use std::collections::BTreeSet;
+use crate::args::root::MetaArgs;
+use crate::pg::{PgConfig, POOL_SIZE_DEFAULT};
+use crate::utils::OneOrMany;

 #[derive(clap::Args, Debug, PartialEq, Default)]
 #[command(about, version)]
@@ -25,57 +23,264 @@ pub struct PgArgs {
     pub pool_size: Option<u32>,
 }

-#[must_use]
-pub fn parse_pg_args(
-    env: &impl Env,
-    args: &PgArgs,
-    cli_strings: &[String],
-) -> Option<OneOrMany<PgConfig>> {
-    let mut strings = cli_strings
-        .iter()
-        .filter(|s| config::is_postgresql_string(s))
-        .map(std::string::ToString::to_string)
-        .unique()
-        .collect::<BTreeSet<_>>();
-
-    if let Some(s) = env.get_env_str("DATABASE_URL") {
-        if config::is_postgresql_string(&s) {
-            strings.insert(s);
-        } else {
-            warn!("Environment variable DATABASE_URL is not a postgres connection string");
-        }
-    }
-
-    let builders: Vec<_> = strings
+impl PgArgs {
+    pub fn into_config(self, meta: &mut MetaArgs, env: &impl Env) -> Option<OneOrMany<PgConfig>> {
+        let connections = Self::extract_conn_strings(meta, env);
+        let default_srid = self.get_default_srid(env);
+        #[cfg(feature = "ssl")]
+        let ca_root_file = self.get_ca_root_file(env);
+        #[cfg(feature = "ssl")]
+        let danger_accept_invalid_certs = self.get_accept_invalid_cert(env);
+
+        let results: Vec<_> = connections
             .into_iter()
             .map(|s| PgConfig {
                 connection_string: Some(s),
                 #[cfg(feature = "ssl")]
-                ca_root_file: args
-                    .ca_root_file
-                    .clone()
-                    .or_else(|| env.var_os("CA_ROOT_FILE").map(std::path::PathBuf::from)),
+                ca_root_file: ca_root_file.clone(),
                 #[cfg(feature = "ssl")]
-                danger_accept_invalid_certs: args.danger_accept_invalid_certs
-                    || env.get_env_str("DANGER_ACCEPT_INVALID_CERTS").is_some(),
-                default_srid: args.default_srid.or_else(|| {
+                danger_accept_invalid_certs,
+                default_srid,
+                pool_size: self.pool_size,
+                ..Default::default()
+            })
+            .collect();
+        match results.len() {
+            0 => None,
+            1 => Some(OneOrMany::One(results.into_iter().next().unwrap())),
+            _ => Some(OneOrMany::Many(results)),
+        }
+    }
+
+    pub fn override_config(self, pg_config: &mut OneOrMany<PgConfig>, env: &impl Env) {
+        if self.default_srid.is_some() {
+            info!("Overriding configured default SRID to {} on all Postgres connections because of a CLI parameter", self.default_srid.unwrap());
+            pg_config.iter_mut().for_each(|c| {
+                c.default_srid = self.default_srid;
+            });
+        }
+        if self.pool_size.is_some() {
+            info!("Overriding configured pool size to {} on all Postgres connections because of a CLI parameter", self.pool_size.unwrap());
+            pg_config.iter_mut().for_each(|c| {
+                c.pool_size = self.pool_size;
+            });
+        }
+        #[cfg(feature = "ssl")]
+        if self.ca_root_file.is_some() {
+            info!("Overriding root certificate file to {} on all Postgres connections because of a CLI parameter",
+                self.ca_root_file.as_ref().unwrap().display());
+            pg_config.iter_mut().for_each(|c| {
+                c.ca_root_file = self.ca_root_file.clone();
+            });
+        }
+        #[cfg(feature = "ssl")]
+        if self.danger_accept_invalid_certs {
+            info!("Overriding configured setting: all Postgres connections will accept invalid certificates because of a CLI parameter. This is a dangerous option, and should not be used if possible.");
+            pg_config.iter_mut().for_each(|c| {
+                c.danger_accept_invalid_certs = self.danger_accept_invalid_certs;
+            });
+        }
+
+        for v in &[
+            "CA_ROOT_FILE",
+            "DANGER_ACCEPT_INVALID_CERTS",
+            "DATABASE_URL",
+            "DEFAULT_SRID",
+        ] {
+            // We don't want to warn about these in case they were used in the config file expansion
+            if env.has_unused_var(v) {
+                warn!("Environment variable {v} is set, but will be ignored because a configuration file was loaded. Any environment variables can be used inside the config yaml file.");
+            }
+        }
+    }
+
+    fn extract_conn_strings(meta: &mut MetaArgs, env: &impl Env) -> Vec<String> {
+        let mut strings = Vec::new();
+        let mut i = 0;
+        while i < meta.connection.len() {
+            if is_postgresql_string(&meta.connection[i]) {
+                strings.push(meta.connection.remove(i));
+            } else {
+                i += 1;
+            }
+        }
+        if strings.is_empty() {
+            if let Some(s) = env.get_env_str("DATABASE_URL") {
+                if is_postgresql_string(&s) {
+                    info!("Using env var DATABASE_URL to connect to PostgreSQL");
+                    strings.push(s);
+                } else {
+                    warn!("Environment var DATABASE_URL is not a valid postgres connection string");
+                }
+            }
+        }
+        strings
+    }
+
+    fn get_default_srid(&self, env: &impl Env) -> Option<i32> {
+        if self.default_srid.is_some() {
+            return self.default_srid;
+        }
         env.get_env_str("DEFAULT_SRID")
             .and_then(|srid| match srid.parse::<i32>() {
-                Ok(v) => Some(v),
+                Ok(v) => {
+                    info!("Using env var DEFAULT_SRID={v} to set default SRID");
+                    Some(v)
+                }
                 Err(v) => {
                     warn!("Env var DEFAULT_SRID is not a valid integer {srid}: {v}");
                     None
                 }
             })
-                }),
-                pool_size: args.pool_size,
-                ..Default::default()
-            })
-            .collect();
-    match builders.len() {
-        0 => None,
-        1 => Some(OneOrMany::One(builders.into_iter().next().unwrap())),
-        _ => Some(OneOrMany::Many(builders)),
-    }
-}
+    }
+
+    #[cfg(feature = "ssl")]
+    fn get_accept_invalid_cert(&self, env: &impl Env) -> bool {
+        if !self.danger_accept_invalid_certs
+            && env.get_env_str("DANGER_ACCEPT_INVALID_CERTS").is_some()
+        {
+            info!("Using env var DANGER_ACCEPT_INVALID_CERTS to trust invalid certificates");
+            true
+        } else {
+            self.danger_accept_invalid_certs
+        }
+    }
+
+    #[cfg(feature = "ssl")]
+    fn get_ca_root_file(&self, env: &impl Env) -> Option<std::path::PathBuf> {
+        if self.ca_root_file.is_some() {
+            return self.ca_root_file.clone();
+        }
+        let path = env.var_os("CA_ROOT_FILE").map(std::path::PathBuf::from);
+        if let Some(path) = &path {
+            info!(
+                "Using env var CA_ROOT_FILE={} to load trusted root certificates",
+                path.display()
+            );
+        }
+        path
+    }
+}
+
+#[must_use]
+fn is_postgresql_string(s: &str) -> bool {
+    s.starts_with("postgresql://") || s.starts_with("postgres://")
+}
+
+#[cfg(test)]
+mod tests {
+    use super::*;
+    use crate::test_utils::{os, some, FauxEnv};
+
+    #[test]
+    fn test_extract_conn_strings() {
+        let mut meta = MetaArgs {
+            connection: vec![
+                "postgresql://localhost:5432".to_string(),
+                "postgres://localhost:5432".to_string(),
+                "mysql://localhost:3306".to_string(),
+            ],
+            ..Default::default()
+        };
+        assert_eq!(
+            PgArgs::extract_conn_strings(&mut meta, &FauxEnv::default()),
+            vec!["postgresql://localhost:5432", "postgres://localhost:5432"]
+        );
+        assert_eq!(meta.connection, vec!["mysql://localhost:3306"]);
+    }
+
+    #[test]
+    fn test_extract_conn_strings_from_env() {
+        let mut meta = MetaArgs {
+            ..Default::default()
+        };
+        let env = FauxEnv(
+            vec![("DATABASE_URL", os("postgresql://localhost:5432"))]
+                .into_iter()
+                .collect(),
+        );
+        let strings = PgArgs::extract_conn_strings(&mut meta, &env);
+        assert_eq!(strings, vec!["postgresql://localhost:5432"]);
+        assert_eq!(meta.connection, Vec::<String>::new());
+    }
+
+    #[test]
+    fn test_merge_into_config() {
+        let mut meta = MetaArgs {
+            connection: vec!["postgres://localhost:5432".to_string()],
+            ..Default::default()
+        };
+        let config = PgArgs::default().into_config(&mut meta, &FauxEnv::default());
+        assert_eq!(
+            config,
+            Some(OneOrMany::One(PgConfig {
+                connection_string: some("postgres://localhost:5432"),
+                ..Default::default()
+            }))
+        );
+        assert_eq!(meta.connection, Vec::<String>::new());
+    }
+
+    #[test]
+    fn test_merge_into_config2() {
+        let mut meta = MetaArgs::default();
+        let env = FauxEnv(
+            vec![
+                ("DATABASE_URL", os("postgres://localhost:5432")),
+                ("DEFAULT_SRID", os("10")),
+                ("DANGER_ACCEPT_INVALID_CERTS", os("1")),
+                ("CA_ROOT_FILE", os("file")),
+            ]
+            .into_iter()
+            .collect(),
+        );
+        let config = PgArgs::default().into_config(&mut meta, &env);
+        assert_eq!(
+            config,
+            Some(OneOrMany::One(PgConfig {
+                connection_string: some("postgres://localhost:5432"),
+                default_srid: Some(10),
+                #[cfg(feature = "ssl")]
+                danger_accept_invalid_certs: true,
+                #[cfg(feature = "ssl")]
+                ca_root_file: Some(std::path::PathBuf::from("file")),
+                ..Default::default()
+            }))
+        );
+    }
+
+    #[test]
+    fn test_merge_into_config3() {
+        let mut meta = MetaArgs::default();
+        let env = FauxEnv(
+            vec![
+                ("DATABASE_URL", os("postgres://localhost:5432")),
+                ("DEFAULT_SRID", os("10")),
+                ("CA_ROOT_FILE", os("file")),
+            ]
+            .into_iter()
+            .collect(),
+        );
+        let pg_args = PgArgs {
+            #[cfg(feature = "ssl")]
+            ca_root_file: Some(std::path::PathBuf::from("file2")),
+            #[cfg(feature = "ssl")]
+            danger_accept_invalid_certs: true,
+            default_srid: Some(20),
+            ..Default::default()
+        };
+        let config = pg_args.into_config(&mut meta, &env);
+        assert_eq!(
+            config,
+            Some(OneOrMany::One(PgConfig {
+                connection_string: some("postgres://localhost:5432"),
+                default_srid: Some(20),
+                #[cfg(feature = "ssl")]
+                danger_accept_invalid_certs: true,
+                #[cfg(feature = "ssl")]
+                ca_root_file: Some(std::path::PathBuf::from("file2")),
+                ..Default::default()
+            }))
+        );
+    }
+}

src/args/root.rs (new file, 159 lines)

@@ -0,0 +1,159 @@
use std::path::PathBuf;

use clap::Parser;
use log::warn;

use crate::args::environment::Env;
use crate::args::pg::PgArgs;
use crate::args::srv::SrvArgs;
use crate::config::Config;
use crate::{Error, Result};

#[derive(Parser, Debug, PartialEq, Default)]
#[command(about, version)]
pub struct Args {
    #[command(flatten)]
    pub meta: MetaArgs,
    #[command(flatten)]
    pub srv: SrvArgs,
    #[command(flatten)]
    pub pg: Option<PgArgs>,
}

// None of these params will be transferred to the config
#[derive(Parser, Debug, Clone, PartialEq, Default)]
#[command(about, version)]
pub struct MetaArgs {
    // config may need a conflicts_with = "SourcesArgs"
    // see https://github.com/clap-rs/clap/discussions/4562
    /// Path to config file. If set, no tile source-related parameters are allowed.
    #[arg(short, long)]
    pub config: Option<PathBuf>,
    /// Save resulting config to a file or use "-" to print to stdout.
    /// By default, only print if sources are auto-detected.
    #[arg(long)]
    pub save_config: Option<PathBuf>,
    /// [Deprecated] Scan for new sources on sources list requests
    #[arg(short, long, hide = true)]
    pub watch: bool,
    /// Database connection strings
    pub connection: Vec<String>,
}

impl Args {
    pub fn merge_into_config(mut self, config: &mut Config, env: &impl Env) -> Result<()> {
        if self.meta.watch {
            warn!("The --watch flag is no longer supported, and will be ignored");
        }
        if env.has_unused_var("WATCH_MODE") {
            warn!("The WATCH_MODE env variable is no longer supported, and will be ignored");
        }
        if self.meta.config.is_some() && !self.meta.connection.is_empty() {
            return Err(Error::ConfigAndConnectionsError);
        }

        self.srv.merge_into_config(&mut config.srv);

        let pg_args = self.pg.unwrap_or_default();
        if let Some(pg_config) = &mut config.postgres {
            // config was loaded from a file, we can only apply a few CLI overrides to it
            pg_args.override_config(pg_config, env);
        } else {
            config.postgres = pg_args.into_config(&mut self.meta, env);
        }

        if self.meta.connection.is_empty() {
            Ok(())
        } else {
            let connections = self.meta.connection.clone();
            Err(Error::UnrecognizableConnections(connections))
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::pg::PgConfig;
    use crate::test_utils::{some, FauxEnv};
    use crate::utils::OneOrMany;

    fn parse(args: &[&str]) -> Result<(Config, MetaArgs)> {
        let args = Args::parse_from(args);
        let meta = args.meta.clone();
        let mut config = Config::default();
        args.merge_into_config(&mut config, &FauxEnv::default())?;
        Ok((config, meta))
    }

    #[test]
    fn cli_no_args() {
        let args = parse(&["martin"]).unwrap();
        let expected = (Config::default(), MetaArgs::default());
        assert_eq!(args, expected);
    }

    #[test]
    fn cli_with_config() {
        let args = parse(&["martin", "--config", "c.toml"]).unwrap();
        let meta = MetaArgs {
            config: Some(PathBuf::from("c.toml")),
            ..Default::default()
        };
        assert_eq!(args, (Config::default(), meta));

        let args = parse(&["martin", "--config", "c.toml", "--save-config", "s.toml"]).unwrap();
        let meta = MetaArgs {
            config: Some(PathBuf::from("c.toml")),
            save_config: Some(PathBuf::from("s.toml")),
            ..Default::default()
        };
        assert_eq!(args, (Config::default(), meta));

        let args = parse(&["martin", "postgres://connection"]).unwrap();
        let cfg = Config {
            postgres: Some(OneOrMany::One(PgConfig {
                connection_string: some("postgres://connection"),
                ..Default::default()
            })),
            ..Default::default()
        };
        let meta = MetaArgs {
            connection: vec!["postgres://connection".to_string()],
            ..Default::default()
        };
        assert_eq!(args, (cfg, meta));
    }

    #[test]
    fn cli_bad_arguments() {
        for params in [
            ["martin", "--config", "c.toml", "--tmp"].as_slice(),
            ["martin", "--config", "c.toml", "-c", "t.toml"].as_slice(),
        ] {
            let res = Args::try_parse_from(params);
            assert!(res.is_err(), "Expected error, got: {res:?} for {params:?}");
        }
    }

    #[test]
    fn cli_bad_parsed_arguments() {
        let args = Args::parse_from(["martin", "--config", "c.toml", "postgres://a"]);
        let env = FauxEnv::default();
        let mut config = Config::default();
        let err = args.merge_into_config(&mut config, &env).unwrap_err();
        assert!(matches!(err, crate::Error::ConfigAndConnectionsError));
    }

    #[test]
    fn cli_unknown_con_str() {
        let args = Args::parse_from(["martin", "foobar"]);
        let env = FauxEnv::default();
        let mut config = Config::default();
        let err = args.merge_into_config(&mut config, &env).unwrap_err();
        let bad = vec!["foobar".to_string()];
        assert!(matches!(err, crate::Error::UnrecognizableConnections(v) if v == bad));
    }
}


@@ -1,4 +1,4 @@
-use crate::srv::config::{SrvConfig, KEEP_ALIVE_DEFAULT, LISTEN_ADDRESSES_DEFAULT};
+use crate::srv::{SrvConfig, KEEP_ALIVE_DEFAULT, LISTEN_ADDRESSES_DEFAULT};

 #[derive(clap::Args, Debug, PartialEq, Default)]
 #[command(about, version)]
@@ -12,12 +12,17 @@ pub struct SrvArgs {
     pub workers: Option<usize>,
 }

-impl From<SrvArgs> for SrvConfig {
-    fn from(args: SrvArgs) -> Self {
-        SrvConfig {
-            keep_alive: args.keep_alive,
-            listen_addresses: args.listen_addresses,
-            worker_processes: args.workers,
+impl SrvArgs {
+    pub(crate) fn merge_into_config(self, srv_config: &mut SrvConfig) {
+        // Override config values with the ones from the command line
+        if self.keep_alive.is_some() {
+            srv_config.keep_alive = self.keep_alive;
+        }
+        if self.listen_addresses.is_some() {
+            srv_config.listen_addresses = self.listen_addresses;
+        }
+        if self.workers.is_some() {
+            srv_config.worker_processes = self.workers;
         }
     }
 }


@@ -1,40 +1,35 @@
-use actix_web::dev::Server;
-use clap::Parser;
-use log::info;
-use martin::args::Args;
-use martin::config::{read_config, Config};
-use martin::pg::config::PgConfig;
-use martin::source::IdResolver;
-use martin::srv::server;
-use martin::srv::server::RESERVED_KEYWORDS;
-use martin::Error::ConfigWriteError;
-use martin::Result;
-use std::env;
 use std::ffi::OsStr;
 use std::fmt::Display;
 use std::fs::File;
 use std::io::Write;
+
+use actix_web::dev::Server;
+use clap::Parser;
+use log::{error, info, log_enabled};
+use martin::args::{Args, OsEnv};
+use martin::pg::PgConfig;
+use martin::srv::{new_server, RESERVED_KEYWORDS};
+use martin::Error::ConfigWriteError;
+use martin::{read_config, Config, IdResolver, Result};

 const VERSION: &str = env!("CARGO_PKG_VERSION");

 async fn start(args: Args) -> Result<Server> {
     info!("Starting Martin v{VERSION}");

+    let env = OsEnv::default();
     let save_config = args.meta.save_config.clone();
-    let file_cfg = if let Some(ref cfg_filename) = args.meta.config {
+    let mut config = if let Some(ref cfg_filename) = args.meta.config {
         info!("Using {}", cfg_filename.display());
-        Some(read_config(cfg_filename)?)
+        read_config(cfg_filename, &env)?
     } else {
         info!("Config file is not specified, auto-detecting sources");
-        None
+        Config::default()
     };
-    let mut args_cfg = Config::try_from(args)?;
-    if let Some(file_cfg) = file_cfg {
-        args_cfg.merge(file_cfg);
-    }
-    let id_resolver = IdResolver::new(RESERVED_KEYWORDS);
-    let mut config = args_cfg.finalize()?;
-    let sources = config.resolve(id_resolver).await?;
+
+    args.merge_into_config(&mut config, &env)?;
+    config.finalize()?;
+    let sources = config.resolve(IdResolver::new(RESERVED_KEYWORDS)).await?;

     if let Some(file_name) = save_config {
         let yaml = serde_yaml::to_string(&config).expect("Unable to serialize config");
@@ -60,7 +55,7 @@ async fn start(args: Args) -> Result<Server> {
         info!("Use --save-config to save or print Martin configuration.");
     }

-    let (server, listen_addresses) = server::new(config.srv, sources);
+    let (server, listen_addresses) = new_server(config.srv, sources)?;

     info!("Martin has been started on {listen_addresses}.");
     info!("Use http://{listen_addresses}/catalog to get the list of available sources.");
@@ -80,6 +75,11 @@ async fn main() {
 }

 fn on_error<E: Display>(e: E) -> ! {
+    // Ensure the message is printed, even if the logging is disabled
+    if log_enabled!(log::Level::Error) {
+        error!("{e}");
+    } else {
         eprintln!("{e}");
+    }
     std::process::exit(1);
 }


@@ -1,20 +1,20 @@
-use crate::one_or_many::OneOrMany;
-use crate::pg::config::PgConfig;
-use crate::source::IdResolver;
-use crate::srv::config::SrvConfig;
-use crate::srv::server::Sources;
-use crate::utils;
-use crate::utils::Error::{ConfigLoadError, ConfigParseError};
-use crate::utils::Result;
-use futures::future::try_join_all;
-use log::warn;
-use serde::{Deserialize, Serialize};
-use serde_yaml::Value;
 use std::collections::HashMap;
 use std::fs::File;
 use std::io::prelude::*;
 use std::path::Path;
+
+use futures::future::try_join_all;
+use log::warn;
+use serde::{Deserialize, Serialize};
+use serde_yaml::Value;
+
+use crate::args::OsEnv;
+use crate::pg::PgConfig;
+use crate::source::{IdResolver, Sources};
+use crate::srv::SrvConfig;
+use crate::utils::{OneOrMany, Result};
+use crate::Error::{ConfigLoadError, ConfigParseError, NoSources};

 #[derive(Clone, Debug, Default, PartialEq, Serialize, Deserialize)]
 pub struct Config {
     #[serde(flatten)]
@@ -28,6 +28,25 @@ pub struct Config {
 }

 impl Config {
+    /// Apply defaults to the config, and validate if there is a connection string
+    pub fn finalize(&mut self) -> Result<&Self> {
+        report_unrecognized_config("", &self.unrecognized);
+        let any = if let Some(pg) = &mut self.postgres {
+            for pg in pg.iter_mut() {
+                pg.finalize()?;
+            }
+            !pg.is_empty()
+        } else {
+            false
+        };
+        if any {
+            Ok(self)
+        } else {
+            Err(NoSources)
+        }
+    }
+
     pub async fn resolve(&mut self, idr: IdResolver) -> Result<Sources> {
         if let Some(mut pg) = self.postgres.take() {
             Ok(try_join_all(pg.iter_mut().map(|s| s.resolve(idr.clone())))
@@ -42,55 +61,6 @@ impl Config {
             Ok(HashMap::new())
         }
     }
-
-    pub fn merge(&mut self, other: Self) {
-        self.unrecognized.extend(other.unrecognized);
-        self.srv.merge(other.srv);
-        if let Some(other) = other.postgres {
-            match &mut self.postgres {
-                Some(_first) => {
-                    unimplemented!("merging multiple postgres configs is not yet supported");
-                    // first.merge(other);
-                }
-                None => self.postgres = Some(other),
-            }
-        }
-    }
-
-    /// Apply defaults to the config, and validate if there is a connection string
-    pub fn finalize(self) -> Result<Config> {
-        report_unrecognized_config("", &self.unrecognized);
-        Ok(Config {
-            srv: self.srv,
-            postgres: self
-                .postgres
-                .map(|pg| pg.map(|v| v.finalize().map_err(utils::Error::PostgresError)))
-                .transpose()?,
-            unrecognized: self.unrecognized,
-        })
-    }
-}
-
-/// Update empty option in place with a non-empty value from the second option.
-pub fn set_option<T>(first: &mut Option<T>, second: Option<T>) {
-    if first.is_none() && second.is_some() {
-        *first = second;
-    }
-}
-
-/// Merge two options
-#[must_use]
-pub fn merge_option<T>(
-    first: Option<T>,
-    second: Option<T>,
-    merge: impl FnOnce(T, T) -> T,
-) -> Option<T> {
-    match (first, second) {
-        (Some(first), Some(second)) => Some(merge(first, second)),
-        (None, Some(second)) => Some(second),
-        (first, None) => first,
-    }
 }

 pub fn report_unrecognized_config(prefix: &str, unrecognized: &HashMap<String, Value>) {
@@ -100,34 +70,45 @@ pub fn report_unrecognized_config(prefix: &str, unrecognized: &HashMap<String, V
 }

 /// Read config from a file
-pub fn read_config(file_name: &Path) -> Result<Config> {
+pub fn read_config(file_name: &Path, env: &OsEnv) -> Result<Config> {
     let mut file = File::open(file_name).map_err(|e| ConfigLoadError(e, file_name.into()))?;
     let mut contents = String::new();
     file.read_to_string(&mut contents)
         .map_err(|e| ConfigLoadError(e, file_name.into()))?;
-    subst::yaml::from_str(contents.as_str(), &subst::Env)
-        .map_err(|e| ConfigParseError(e, file_name.into()))
+    subst::yaml::from_str(contents.as_str(), env).map_err(|e| ConfigParseError(e, file_name.into()))
 }

 #[cfg(test)]
-mod tests {
-    use super::*;
-    use crate::pg::utils::tests::{assert_config, some_str};
+pub mod tests {
     use indoc::indoc;
+
+    use super::*;
+    use crate::config::Config;
+    use crate::test_utils::some;
+
+    pub fn parse_config(yaml: &str) -> Config {
+        serde_yaml::from_str(yaml).expect("parse yaml")
+    }
+
+    pub fn assert_config(yaml: &str, expected: &Config) {
+        let mut config = parse_config(yaml);
+        config.finalize().expect("finalize");
+        assert_eq!(&config, expected);
+    }

     #[test]
-    fn parse_config() {
-        assert_config(
-            indoc! {"
+    fn parse_empty_config() {
+        assert_eq!(
+            parse_config(indoc! {"
                 ---
                 keep_alive: 75
                 listen_addresses: '0.0.0.0:3000'
                 worker_processes: 8
-            "},
-            &Config {
+            "}),
+            Config {
                 srv: SrvConfig {
                     keep_alive: Some(75),
-                    listen_addresses: some_str("0.0.0.0:3000"),
+                    listen_addresses: some("0.0.0.0:3000"),
                     worker_processes: Some(8),
                 },
                 ..Default::default()


@@ -7,15 +7,23 @@
 #![allow(clippy::module_name_repetitions)]

 pub mod args;
-pub mod config;
-pub mod one_or_many;
+mod config;
 pub mod pg;
-pub mod source;
+mod source;
 pub mod srv;
-pub mod utils;
+mod utils;

-pub use crate::utils::Error;
-pub use crate::utils::Result;
+#[cfg(test)]
+#[path = "utils/test_utils.rs"]
+mod test_utils;
+
+// test_utils is used from tests in other modules, and it uses this crate's objects.
+// Must make it accessible as crate::Env from both places when testing.
+#[cfg(test)]
+pub use crate::args::Env;
+pub use crate::config::{read_config, Config};
+pub use crate::source::{IdResolver, Source, Sources, Xyz};
+pub use crate::utils::{Error, Result};

 // Ensure README.md contains valid code
 #[cfg(doctest)]
View File

@ -1,81 +0,0 @@
use serde::{Deserialize, Serialize};
use std::mem;
use std::slice::Iter;
use std::vec::IntoIter;
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(untagged)]
pub enum OneOrMany<T> {
One(T),
Many(Vec<T>),
}
impl<T> IntoIterator for OneOrMany<T> {
type Item = T;
type IntoIter = IntoIter<T>;
fn into_iter(self) -> Self::IntoIter {
match self {
// OneOrMany::One(s) => OneOrManyIter::One(Some(s)),
// OneOrMany::Many(v) => OneOrManyIter::Many(v.into_iter()),
OneOrMany::One(v) => vec![v].into_iter(),
OneOrMany::Many(v) => v.into_iter(),
}
}
}
impl<T: Clone> OneOrMany<T> {
pub fn iter_mut(&mut self) -> impl Iterator<Item = &mut T> {
match self {
OneOrMany::Many(v) => v.iter_mut(),
OneOrMany::One(v) => std::slice::from_mut(v).iter_mut(),
}
}
pub fn iter(&self) -> Iter<T> {
self.as_slice().iter()
}
pub fn as_slice(&self) -> &[T] {
match self {
OneOrMany::One(item) => std::slice::from_ref(item),
OneOrMany::Many(v) => v.as_slice(),
}
}
pub fn map<R: Clone, F>(self, mut f: F) -> crate::Result<OneOrMany<R>>
where
F: FnMut(T) -> crate::Result<R>,
{
Ok(match self {
Self::One(v) => OneOrMany::One(f(v)?),
Self::Many(v) => OneOrMany::Many(v.into_iter().map(f).collect::<crate::Result<_>>()?),
})
}
pub fn generalize(self) -> Vec<T> {
match self {
Self::One(v) => vec![v],
Self::Many(v) => v,
}
}
pub fn merge(&mut self, other: Self) {
// There is no allocation with Vec::new()
*self = match (mem::replace(self, Self::Many(Vec::new())), other) {
(Self::One(a), Self::One(b)) => Self::Many(vec![a, b]),
(Self::One(a), Self::Many(mut b)) => {
b.insert(0, a);
Self::Many(b)
}
(Self::Many(mut a), Self::One(b)) => {
a.push(b);
Self::Many(a)
}
(Self::Many(mut a), Self::Many(b)) => {
a.extend(b);
Self::Many(a)
}
};
}
}


@@ -1,16 +1,14 @@
-use crate::config::{report_unrecognized_config, set_option};
+use futures::future::try_join;
+use serde::{Deserialize, Serialize};
+use tilejson::TileJSON;
+
+use crate::config::report_unrecognized_config;
 use crate::pg::config_function::FuncInfoSources;
 use crate::pg::config_table::TableInfoSources;
 use crate::pg::configurator::PgBuilder;
 use crate::pg::pool::Pool;
-use crate::pg::utils::PgError::NoConnectionString;
-use crate::pg::utils::Result;
-use crate::source::IdResolver;
-use crate::srv::server::Sources;
-use crate::utils::Schemas;
-use futures::future::try_join;
-use serde::{Deserialize, Serialize};
-use tilejson::TileJSON;
+use crate::pg::utils::{Result, Schemas};
+use crate::source::{IdResolver, Sources};

 pub trait PgInfo {
     fn format_id(&self) -> String;
@@ -43,24 +41,8 @@ pub struct PgConfig {
 }

 impl PgConfig {
-    pub fn merge(&mut self, other: Self) -> &mut Self {
-        set_option(&mut self.connection_string, other.connection_string);
-        #[cfg(feature = "ssl")]
-        {
-            set_option(&mut self.ca_root_file, other.ca_root_file);
-            self.danger_accept_invalid_certs |= other.danger_accept_invalid_certs;
-        }
-        set_option(&mut self.default_srid, other.default_srid);
-        set_option(&mut self.pool_size, other.pool_size);
-        set_option(&mut self.auto_tables, other.auto_tables);
-        set_option(&mut self.auto_functions, other.auto_functions);
-        set_option(&mut self.tables, other.tables);
-        set_option(&mut self.functions, other.functions);
-        self
-    }
-
     /// Apply defaults to the config, and validate if there is a connection string
-    pub fn finalize(self) -> Result<PgConfig> {
+    pub fn finalize(&mut self) -> Result<&Self> {
         if let Some(ref ts) = self.tables {
             for (k, v) in ts {
                 report_unrecognized_config(&format!("tables.{k}."), &v.unrecognized);
@@ -71,13 +53,9 @@ impl PgConfig {
                 report_unrecognized_config(&format!("functions.{k}."), &v.unrecognized);
             }
         }
-        let connection_string = self.connection_string.ok_or(NoConnectionString)?;
-        Ok(PgConfig {
-            connection_string: Some(connection_string),
-            run_autodiscovery: self.tables.is_none() && self.functions.is_none(),
-            ..self
-        })
+        self.run_autodiscovery = self.tables.is_none() && self.functions.is_none();
+        Ok(self)
     }

     pub async fn resolve(&mut self, id_resolver: IdResolver) -> Result<(Sources, Pool)> {
@@ -97,26 +75,23 @@ impl PgConfig {
     }
 }

-#[must_use]
-pub fn is_postgresql_string(s: &str) -> bool {
-    s.starts_with("postgresql://") || s.starts_with("postgres://")
-}
-
 #[cfg(test)]
 mod tests {
-    use super::*;
-    use crate::config::Config;
-    use crate::one_or_many::OneOrMany::{Many, One};
-    use crate::pg::config_function::FunctionInfo;
-    use crate::pg::config_table::TableInfo;
-    use crate::pg::utils::tests::{assert_config, some_str};
-    use indoc::indoc;
     use std::collections::HashMap;
+
+    use indoc::indoc;
     use tilejson::Bounds;
+
+    use super::*;
+    use crate::config::tests::assert_config;
+    use crate::config::Config;
+    use crate::pg::config_function::FunctionInfo;
+    use crate::pg::config_table::TableInfo;
+    use crate::test_utils::some;
+    use crate::utils::OneOrMany::{Many, One};

     #[test]
-    #[allow(clippy::too_many_lines)]
-    fn parse_config() {
+    fn parse_pg_one() {
         assert_config(
             indoc! {"
                 ---
@@ -125,14 +100,17 @@ mod tests {
             "},
             &Config {
                 postgres: Some(One(PgConfig {
-                    connection_string: some_str("postgresql://postgres@localhost/db"),
+                    connection_string: some("postgresql://postgres@localhost/db"),
                     run_autodiscovery: true,
                     ..Default::default()
                 })),
                 ..Default::default()
             },
         );
+    }

+    #[test]
+    fn parse_pg_two() {
         assert_config(
             indoc! {"
                 ---
@@ -143,12 +121,12 @@ mod tests {
             &Config {
                 postgres: Some(Many(vec![
                     PgConfig {
-                        connection_string: some_str("postgres://postgres@localhost:5432/db"),
+                        connection_string: some("postgres://postgres@localhost:5432/db"),
                         run_autodiscovery: true,
                         ..Default::default()
                     },
                     PgConfig {
-                        connection_string: some_str("postgresql://postgres@localhost:5433/db"),
+                        connection_string: some("postgresql://postgres@localhost:5433/db"),
                         run_autodiscovery: true,
                         ..Default::default()
                     },
@@ -156,7 +134,10 @@ mod tests {
                 ..Default::default()
             },
         );
+    }

+    #[test]
+    fn parse_pg_config() {
         assert_config(
             indoc! {"
                 ---
@@ -192,7 +173,7 @@ mod tests {
             "},
             &Config {
                 postgres: Some(One(PgConfig {
-                    connection_string: some_str("postgres://postgres@localhost:5432/db"),
+                    connection_string: some("postgres://postgres@localhost:5432/db"),
                     default_srid: Some(4326),
                     pool_size: Some(20),
                     tables: Some(HashMap::from([(
@@ -208,7 +189,7 @@ mod tests {
                         extent: Some(4096),
                         buffer: Some(64),
                         clip_geom: Some(true),
-                        geometry_type: some_str("GEOMETRY"),
+                        geometry_type: some("GEOMETRY"),
                         properties: HashMap::from([("gid".to_string(), "int4".to_string())]),
                         ..Default::default()
                     },


@@ -1,10 +1,12 @@
+use std::collections::HashMap;
+
+use serde::{Deserialize, Serialize};
+use serde_yaml::Value;
+use tilejson::{Bounds, TileJSON};
+
 use crate::pg::config::PgInfo;
 use crate::pg::utils::create_tilejson;
 use crate::utils::InfoMap;
-use serde::{Deserialize, Serialize};
-use serde_yaml::Value;
-use std::collections::HashMap;
-use tilejson::{Bounds, TileJSON};

 pub type FuncInfoSources = InfoMap<FunctionInfo>;


@@ -1,10 +1,12 @@
+use std::collections::HashMap;
+
+use serde::{Deserialize, Serialize};
+use serde_yaml::Value;
+use tilejson::{Bounds, TileJSON};
+
 use crate::pg::config::PgInfo;
 use crate::pg::utils::create_tilejson;
 use crate::utils::InfoMap;
-use serde::{Deserialize, Serialize};
-use serde_yaml::Value;
-use std::collections::HashMap;
-use tilejson::{Bounds, TileJSON};

 pub type TableInfoSources = InfoMap<TableInfo>;


@@ -1,3 +1,10 @@
+use std::cmp::Ordering;
+use std::collections::{HashMap, HashSet};
+
+use futures::future::join_all;
+use itertools::Itertools;
+use log::{debug, error, info, warn};
+
 use crate::pg::config::{PgConfig, PgInfo};
 use crate::pg::config_function::{FuncInfoSources, FunctionInfo};
 use crate::pg::config_table::{TableInfo, TableInfoSources};
@@ -6,15 +13,9 @@ use crate::pg::pg_source::{PgSource, PgSqlInfo};
 use crate::pg::pool::Pool;
 use crate::pg::table_source::{calc_srid, get_table_sources, merge_table_info, table_to_query};
 use crate::pg::utils::PgError::InvalidTableExtent;
-use crate::pg::utils::Result;
-use crate::source::IdResolver;
-use crate::srv::server::Sources;
-use crate::utils::{find_info, normalize_key, InfoMap, Schemas};
-use futures::future::join_all;
-use itertools::Itertools;
-use log::{debug, error, info, warn};
-use std::cmp::Ordering;
-use std::collections::{HashMap, HashSet};
+use crate::pg::utils::{Result, Schemas};
+use crate::source::{IdResolver, Sources};
+use crate::utils::{find_info, normalize_key, InfoMap};

 pub type SqlFuncInfoMapMap = InfoMap<InfoMap<(PgSqlInfo, FunctionInfo)>>;
 pub type SqlTableInfoMapMapMap = InfoMap<InfoMap<InfoMap<TableInfo>>>;
@@ -62,7 +63,7 @@ impl PgBuilder {
             let Some(tables) = find_info(schemas, &cfg_inf.table, "table", id) else { continue };
             let Some(src_inf) = find_info(tables, &cfg_inf.geometry_column, "geometry column", id) else { continue };

-            let dup = used.insert((&cfg_inf.schema, &cfg_inf.table, &cfg_inf.geometry_column));
+            let dup = !used.insert((&cfg_inf.schema, &cfg_inf.table, &cfg_inf.geometry_column));
             let dup = if dup { "duplicate " } else { "" };
             let id2 = self.resolve_id(id.clone(), cfg_inf);


@@ -1,15 +1,17 @@
+use std::collections::HashMap;
+use std::fmt::Write;
+use std::iter::zip;
+
+use log::warn;
+use postgres_protocol::escape::escape_identifier;
+use serde_json::Value;
+
 use crate::pg::config_function::FunctionInfo;
 use crate::pg::configurator::SqlFuncInfoMapMap;
 use crate::pg::pg_source::PgSqlInfo;
 use crate::pg::pool::Pool;
 use crate::pg::utils::PgError::PostgresError;
 use crate::pg::utils::Result;
-use log::warn;
-use postgres_protocol::escape::escape_identifier;
-use serde_json::Value;
-use std::collections::HashMap;
-use std::fmt::Write;
-use std::iter::zip;

 /// Get the list of functions from the database
 ///
@@ -90,7 +92,7 @@ pub async fn get_function_sources(pool: &Pool) -> Result<SqlFuncInfoMapMap> {
                     input_types.len() == 4,
                     format!(
                         "{schema}.{function}({}) -> {ret_inf}",
-                        input_names.join(", ")
+                        input_types.join(", ")
                     ),
                 ),
                 FunctionInfo::new(schema, function),


@@ -1,9 +1,16 @@
-pub mod config;
-pub mod config_function;
-pub mod config_table;
-pub mod configurator;
-pub mod function_source;
-pub mod pg_source;
-pub mod pool;
-pub mod table_source;
-pub mod utils;
+mod config;
+mod config_function;
+mod config_table;
+mod configurator;
+mod function_source;
+mod pg_source;
+mod pool;
+mod table_source;
+mod utils;
+
+pub use config::PgConfig;
+pub use config_function::FunctionInfo;
+pub use config_table::TableInfo;
+pub use function_source::get_function_sources;
+pub use pool::{Pool, POOL_SIZE_DEFAULT};
+pub use utils::{PgError, Schemas};


@@ -1,16 +1,18 @@
-use crate::pg::pool::Pool;
-use crate::pg::utils::PgError::{GetTileError, GetTileWithQueryError, PrepareQueryError};
-use crate::pg::utils::{is_valid_zoom, query_to_json};
-use crate::source::{Source, Tile, UrlQuery, Xyz};
-use crate::utils::Result;
+use std::collections::HashMap;
+
 use async_trait::async_trait;
 use bb8_postgres::tokio_postgres::types::ToSql;
 use log::debug;
 use martin_tile_utils::DataFormat;
 use postgres::types::Type;
-use std::collections::HashMap;
 use tilejson::TileJSON;
+
+use crate::pg::pool::Pool;
+use crate::pg::utils::query_to_json;
+use crate::pg::utils::PgError::{GetTileError, GetTileWithQueryError, PrepareQueryError};
+use crate::source::{Source, Tile, UrlQuery, Xyz};
+use crate::utils::{is_valid_zoom, Result};

 #[derive(Clone, Debug)]
 pub struct PgSource {
     id: String,


@@ -1,13 +1,15 @@
+use std::str::FromStr;
+
+use bb8::PooledConnection;
+use bb8_postgres::{tokio_postgres as pg, PostgresConnectionManager};
+use log::{info, warn};
+use semver::Version;
+
 use crate::pg::config::PgConfig;
 use crate::pg::utils::PgError::{
     BadConnectionString, BadPostgisVersion, PostgisTooOld, PostgresError, PostgresPoolConnError,
 };
 use crate::pg::utils::Result;
-use bb8::PooledConnection;
-use bb8_postgres::{tokio_postgres as pg, PostgresConnectionManager};
-use log::{info, warn};
-use semver::Version;
-use std::str::FromStr;

 #[cfg(feature = "ssl")]
 pub type ConnectionManager = PostgresConnectionManager<postgres_openssl::MakeTlsConnector>;
@@ -50,9 +52,10 @@ impl Pool {
         #[cfg(feature = "ssl")]
         let manager = {
-            use crate::pg::utils::PgError::{BadTrustedRootCertError, BuildSslConnectorError};
             use openssl::ssl::{SslConnector, SslMethod, SslVerifyMode};
+
+            use crate::pg::utils::PgError::{BadTrustedRootCertError, BuildSslConnectorError};

             let tls = SslMethod::tls();
             let mut builder = SslConnector::builder(tls).map_err(BuildSslConnectorError)?;

View File

@ -48,7 +48,7 @@ WHERE jsonb_array_length(input_names) IN (3, 4)
AND input_types ->> 0 = 'integer' AND input_types ->> 0 = 'integer'
AND input_types ->> 1 = 'integer' AND input_types ->> 1 = 'integer'
AND input_types ->> 2 = 'integer' AND input_types ->> 2 = 'integer'
AND (input_types ->> 3 = 'json' OR (input_types ->> 3) IS NULL) AND (input_types ->> 3 = 'json' OR input_types ->> 3 = 'jsonb' OR (input_types ->> 3) IS NULL)
AND ( AND (
(data_type = 'bytea' AND out_params IS NULL) (data_type = 'bytea' AND out_params IS NULL)
OR (data_type = 'bytea' AND out_params = '["bytea"]'::jsonb) OR (data_type = 'bytea' AND out_params = '["bytea"]'::jsonb)

View File

@ -1,26 +1,22 @@
use std::collections::HashMap;
use log::{info, warn};
use postgis::ewkb;
use postgres_protocol::escape::{escape_identifier, escape_literal};
use crate::pg::config::PgInfo; use crate::pg::config::PgInfo;
use crate::pg::config_table::TableInfo; use crate::pg::config_table::TableInfo;
use crate::pg::configurator::SqlTableInfoMapMapMap; use crate::pg::configurator::SqlTableInfoMapMapMap;
use crate::pg::pg_source::PgSqlInfo; use crate::pg::pg_source::PgSqlInfo;
use crate::pg::pool::Pool; use crate::pg::pool::Pool;
use crate::pg::utils::PgError::PostgresError; use crate::pg::utils::PgError::PostgresError;
use crate::pg::utils::Result; use crate::pg::utils::{json_to_hashmap, polygon_to_bbox, Result};
use crate::pg::utils::{json_to_hashmap, polygon_to_bbox};
use crate::utils::normalize_key; use crate::utils::normalize_key;
use log::{info, warn};
use postgis::ewkb;
use postgres_protocol::escape::{escape_identifier, escape_literal};
use std::collections::HashMap;
static DEFAULT_EXTENT: u32 = 4096; static DEFAULT_EXTENT: u32 = 4096;
static DEFAULT_BUFFER: u32 = 64; static DEFAULT_BUFFER: u32 = 64;
static DEFAULT_CLIP_GEOM: bool = true; static DEFAULT_CLIP_GEOM: bool = true;
#[derive(Clone, Debug)]
pub struct PgSqlTableInfo {
pub info: TableInfo,
}
pub async fn get_table_sources(pool: &Pool) -> Result<SqlTableInfoMapMapMap> { pub async fn get_table_sources(pool: &Pool) -> Result<SqlTableInfoMapMapMap> {
let conn = pool.get().await?; let conn = pool.get().await?;
let rows = conn let rows = conn

View File

@ -1,13 +1,15 @@
use crate::source::{UrlQuery, Xyz}; use std::collections::HashMap;
use crate::utils::InfoMap;
use actix_http::header::HeaderValue; use itertools::Itertools;
use actix_web::http::Uri;
use postgis::{ewkb, LineString, Point, Polygon}; use postgis::{ewkb, LineString, Point, Polygon};
use postgres::types::Json; use postgres::types::Json;
use semver::Version; use semver::Version;
use std::collections::HashMap; use serde::{Deserialize, Serialize};
use tilejson::{tilejson, Bounds, TileJSON, VectorLayer}; use tilejson::{tilejson, Bounds, TileJSON, VectorLayer};
use crate::source::{UrlQuery, Xyz};
use crate::utils::InfoMap;
#[must_use] #[must_use]
pub fn json_to_hashmap(value: &serde_json::Value) -> InfoMap<String> { pub fn json_to_hashmap(value: &serde_json::Value) -> InfoMap<String> {
let mut hashmap = HashMap::new(); let mut hashmap = HashMap::new();
@ -51,14 +53,6 @@ pub fn polygon_to_bbox(polygon: &ewkb::Polygon) -> Option<Bounds> {
}) })
} }
pub fn parse_x_rewrite_url(header: &HeaderValue) -> Option<String> {
header
.to_str()
.ok()
.and_then(|header| header.parse::<Uri>().ok())
.map(|uri| uri.path().to_owned())
}
#[must_use] #[must_use]
pub fn create_tilejson( pub fn create_tilejson(
name: String, name: String,
@ -82,28 +76,6 @@ pub fn create_tilejson(
tilejson tilejson
} }
#[must_use]
pub fn is_valid_zoom(zoom: i32, minzoom: Option<u8>, maxzoom: Option<u8>) -> bool {
minzoom.map_or(true, |minzoom| zoom >= minzoom.into())
&& maxzoom.map_or(true, |maxzoom| zoom <= maxzoom.into())
}
#[cfg(test)]
pub(crate) mod tests {
use crate::config::Config;
pub fn assert_config(yaml: &str, expected: &Config) {
let config: Config = serde_yaml::from_str(yaml).expect("parse yaml");
let actual = config.finalize().expect("finalize");
assert_eq!(&actual, expected);
}
#[allow(clippy::unnecessary_wraps)]
pub fn some_str(s: &str) -> Option<String> {
Some(s.to_string())
}
}
pub type Result<T> = std::result::Result<T, PgError>; pub type Result<T> = std::result::Result<T, PgError>;
#[derive(thiserror::Error, Debug)] #[derive(thiserror::Error, Debug)]
@ -134,9 +106,6 @@ pub enum PgError {
#[error("PostGIS version {0} is too old, minimum required is {1}")] #[error("PostGIS version {0} is too old, minimum required is {1}")]
PostgisTooOld(Version, Version), PostgisTooOld(Version, Version),
#[error("Database connection string is not set")]
NoConnectionString,
#[error("Invalid extent setting in source {0} for table {1}: extent=0")] #[error("Invalid extent setting in source {0} for table {1}: extent=0")]
InvalidTableExtent(String, String), InvalidTableExtent(String, String),
@ -159,3 +128,33 @@ pub enum PgError {
UrlQuery, UrlQuery,
), ),
} }
/// A list of schemas to include in the discovery process, or a boolean to
/// indicate whether to run discovery at all.
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)]
#[serde(untagged)]
pub enum Schemas {
Bool(bool),
List(Vec<String>),
}
impl Schemas {
/// Returns a list of schemas to include in the discovery process.
/// If self is true, returns a list of all schemas produced by the callback.
pub fn get<'a, I, F>(&self, keys: F) -> Vec<String>
where
I: Iterator<Item = &'a String>,
F: FnOnce() -> I,
{
match self {
Schemas::List(lst) => lst.clone(),
Schemas::Bool(all) => {
if *all {
keys().sorted().map(String::to_string).collect()
} else {
Vec::new()
}
}
}
}
}
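A minimal usage sketch of the new type, assuming serde_yaml as already used by the config tests; the untagged derive accepts either a boolean or a list, and get() sorts only in the all-schemas case:

    let all: Schemas = serde_yaml::from_str("true").unwrap();
    let list: Schemas = serde_yaml::from_str("[tiger, public]").unwrap();

    let keys = vec!["public".to_string(), "tiger".to_string()];
    assert_eq!(all.get(|| keys.iter()), vec!["public", "tiger"]); // sorted
    assert_eq!(list.get(|| keys.iter()), vec!["tiger", "public"]); // kept as written
    assert_eq!(Schemas::Bool(false).get(|| keys.iter()), Vec::<String>::new());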

View File

@ -1,13 +1,14 @@
use crate::utils::Result;
use async_trait::async_trait;
use martin_tile_utils::DataFormat;
use std::collections::hash_map::Entry; use std::collections::hash_map::Entry;
use std::collections::{HashMap, HashSet}; use std::collections::{HashMap, HashSet};
use std::fmt::Write; use std::fmt::{Debug, Display, Formatter, Write};
use std::fmt::{Debug, Display, Formatter};
use std::sync::{Arc, Mutex}; use std::sync::{Arc, Mutex};
use async_trait::async_trait;
use martin_tile_utils::DataFormat;
use tilejson::TileJSON; use tilejson::TileJSON;
use crate::utils::Result;
#[derive(Debug, Copy, Clone)] #[derive(Debug, Copy, Clone)]
pub struct Xyz { pub struct Xyz {
pub z: i32, pub z: i32,
@ -27,6 +28,7 @@ impl Display for Xyz {
pub type Tile = Vec<u8>; pub type Tile = Vec<u8>;
pub type UrlQuery = HashMap<String, String>; pub type UrlQuery = HashMap<String, String>;
pub type Sources = HashMap<String, Box<dyn Source>>;
#[async_trait] #[async_trait]
pub trait Source: Send + Debug { pub trait Source: Send + Debug {

View File

@ -1,4 +1,3 @@
use crate::config::set_option;
use serde::{Deserialize, Serialize}; use serde::{Deserialize, Serialize};
pub const KEEP_ALIVE_DEFAULT: u64 = 75; pub const KEEP_ALIVE_DEFAULT: u64 = 75;
@ -13,12 +12,3 @@ pub struct SrvConfig {
#[serde(skip_serializing_if = "Option::is_none")] #[serde(skip_serializing_if = "Option::is_none")]
pub worker_processes: Option<usize>, pub worker_processes: Option<usize>,
} }
impl SrvConfig {
pub fn merge(&mut self, other: Self) -> &mut Self {
set_option(&mut self.keep_alive, other.keep_alive);
set_option(&mut self.listen_addresses, other.listen_addresses);
set_option(&mut self.worker_processes, other.worker_processes);
self
}
}

View File

@ -1,2 +1,5 @@
pub mod config; mod config;
pub mod server; mod server;
pub use config::{SrvConfig, KEEP_ALIVE_DEFAULT, LISTEN_ADDRESSES_DEFAULT};
pub use server::{new_server, router, AppState, IndexEntry, RESERVED_KEYWORDS};

View File

@ -1,7 +1,8 @@
use crate::pg::utils::parse_x_rewrite_url; use std::cmp::Ordering;
use crate::source::{Source, UrlQuery, Xyz}; use std::time::Duration;
use crate::srv::config::{SrvConfig, KEEP_ALIVE_DEFAULT, LISTEN_ADDRESSES_DEFAULT};
use actix_cors::Cors; use actix_cors::Cors;
use actix_http::header::HeaderValue;
use actix_web::dev::Server; use actix_web::dev::Server;
use actix_web::http::header::CACHE_CONTROL; use actix_web::http::header::CACHE_CONTROL;
use actix_web::http::Uri; use actix_web::http::Uri;
@ -16,25 +17,24 @@ use itertools::Itertools;
use log::{debug, error}; use log::{debug, error};
use martin_tile_utils::DataFormat; use martin_tile_utils::DataFormat;
use serde::{Deserialize, Serialize}; use serde::{Deserialize, Serialize};
use std::cmp::Ordering;
use std::collections::HashMap;
use std::time::Duration;
use tilejson::{TileJSON, VectorLayer}; use tilejson::{TileJSON, VectorLayer};
use crate::source::{Source, Sources, UrlQuery, Xyz};
use crate::srv::config::{SrvConfig, KEEP_ALIVE_DEFAULT, LISTEN_ADDRESSES_DEFAULT};
use crate::Error::BindingError;
/// List of keywords that cannot be used as source IDs. Some of these are reserved for future use. /// List of keywords that cannot be used as source IDs. Some of these are reserved for future use.
/// Reserved keywords must never end in a "dot number" (e.g. ".1") /// Reserved keywords must never end in a "dot number" (e.g. ".1")
pub const RESERVED_KEYWORDS: &[&str] = &[ pub const RESERVED_KEYWORDS: &[&str] = &[
"catalog", "config", "health", "help", "index", "manifest", "refresh", "reload", "status", "catalog", "config", "health", "help", "index", "manifest", "refresh", "reload", "status",
]; ];
pub type Sources = HashMap<String, Box<dyn Source>>;
pub struct AppState { pub struct AppState {
pub sources: Sources, pub sources: Sources,
} }
impl AppState { impl AppState {
pub fn get_source(&self, id: &str) -> Result<&dyn Source> { fn get_source(&self, id: &str) -> Result<&dyn Source> {
Ok(self Ok(self
.sources .sources
.get(id) .get(id)
@ -299,7 +299,8 @@ pub fn router(cfg: &mut web::ServiceConfig) {
.service(get_tile); .service(get_tile);
} }
pub fn new(config: SrvConfig, sources: Sources) -> (Server, String) { /// Create a new initialized Actix `Server` together with the listening address.
pub fn new_server(config: SrvConfig, sources: Sources) -> crate::Result<(Server, String)> {
let keep_alive = Duration::from_secs(config.keep_alive.unwrap_or(KEEP_ALIVE_DEFAULT)); let keep_alive = Duration::from_secs(config.keep_alive.unwrap_or(KEEP_ALIVE_DEFAULT));
let worker_processes = config.worker_processes.unwrap_or_else(num_cpus::get); let worker_processes = config.worker_processes.unwrap_or_else(num_cpus::get);
let listen_addresses = config let listen_addresses = config
@ -324,19 +325,27 @@ pub fn new(config: SrvConfig, sources: Sources) -> (Server, String) {
.configure(router) .configure(router)
}) })
.bind(listen_addresses.clone()) .bind(listen_addresses.clone())
.unwrap_or_else(|_| panic!("Can't bind to {listen_addresses}")) .map_err(|e| BindingError(e, listen_addresses.clone()))?
.keep_alive(keep_alive) .keep_alive(keep_alive)
.shutdown_timeout(0) .shutdown_timeout(0)
.workers(worker_processes) .workers(worker_processes)
.run(); .run();
(server, listen_addresses) Ok((server, listen_addresses))
} }
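Since new_server is now fallible, a bind failure surfaces as Error::BindingError instead of a panic. A hypothetical call site under the new signature:

    // Sketch only: `config` and `sources` come from the usual startup path.
    let (server, listen_addresses) = new_server(config, sources)?;
    log::info!("Martin is listening on {listen_addresses}");
    server.await?;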
pub fn check_zoom(src: &dyn Source, id: &str, zoom: i32) -> bool { fn check_zoom(src: &dyn Source, id: &str, zoom: i32) -> bool {
let is_valid = src.is_valid_zoom(zoom); let is_valid = src.is_valid_zoom(zoom);
if !is_valid { if !is_valid {
debug!("Zoom {zoom} is not valid for source {id}"); debug!("Zoom {zoom} is not valid for source {id}");
} }
is_valid is_valid
} }
fn parse_x_rewrite_url(header: &HeaderValue) -> Option<String> {
header
.to_str()
.ok()
.and_then(|header| header.parse::<Uri>().ok())
.map(|uri| uri.path().to_owned())
}
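The moved helper keeps only the path portion of the x-rewrite-url header, dropping the query string; roughly:

    use actix_http::header::HeaderValue;
    let hv = HeaderValue::from_static("/tiles/points1?token=martin");
    assert_eq!(parse_x_rewrite_url(&hv), Some("/tiles/points1".to_string()));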

5
src/utils/mod.rs Normal file
View File

@ -0,0 +1,5 @@
mod one_or_many;
mod utilities;
pub use one_or_many::OneOrMany;
pub use utilities::*;

65
src/utils/one_or_many.rs Normal file
View File

@ -0,0 +1,65 @@
use std::vec::IntoIter;
use serde::{Deserialize, Serialize};
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(untagged)]
pub enum OneOrMany<T> {
One(T),
Many(Vec<T>),
}
impl<T> IntoIterator for OneOrMany<T> {
type Item = T;
type IntoIter = IntoIter<T>;
fn into_iter(self) -> Self::IntoIter {
match self {
OneOrMany::One(v) => vec![v].into_iter(),
OneOrMany::Many(v) => v.into_iter(),
}
}
}
impl<T: Clone> OneOrMany<T> {
pub fn is_empty(&self) -> bool {
match self {
OneOrMany::One(_) => false,
OneOrMany::Many(v) => v.is_empty(),
}
}
pub fn iter_mut(&mut self) -> impl Iterator<Item = &mut T> {
match self {
OneOrMany::Many(v) => v.iter_mut(),
OneOrMany::One(v) => std::slice::from_mut(v).iter_mut(),
}
}
pub fn as_slice(&self) -> &[T] {
match self {
OneOrMany::One(item) => std::slice::from_ref(item),
OneOrMany::Many(v) => v.as_slice(),
}
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_one_or_many() {
let mut one = OneOrMany::One(1);
let mut many = OneOrMany::Many(vec![1, 2, 3]);
assert_eq!(one.iter_mut().collect::<Vec<_>>(), vec![&1]);
assert_eq!(many.iter_mut().collect::<Vec<_>>(), vec![&1, &2, &3]);
assert_eq!(one.as_slice(), &[1]);
assert_eq!(many.as_slice(), &[1, 2, 3]);
assert_eq!(one.into_iter().collect::<Vec<_>>(), vec![1]);
assert_eq!(many.into_iter().collect::<Vec<_>>(), vec![1, 2, 3]);
}
}
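The #[serde(untagged)] derive is the point of this type: a config value may be written as a single item or as a list. A minimal sketch, using serde_json purely for brevity:

    let one: OneOrMany<String> = serde_json::from_str(r#""a""#).unwrap();
    let many: OneOrMany<String> = serde_json::from_str(r#"["a", "b"]"#).unwrap();
    assert_eq!(one.as_slice(), &["a".to_string()]);
    assert_eq!(many.as_slice(), &["a".to_string(), "b".to_string()]);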

57
src/utils/test_utils.rs Normal file
View File

@ -0,0 +1,57 @@
// This file is included from multiple projects, so we need to make sure
// that `crate::Env` is always available, both when it is part of the lib and when it is external to the test.
use std::ffi::OsString;
use crate::Env;
#[allow(clippy::unnecessary_wraps)]
#[must_use]
pub fn some(s: &str) -> Option<String> {
Some(s.to_string())
}
#[allow(clippy::unnecessary_wraps)]
#[must_use]
pub fn os(s: &str) -> OsString {
OsString::from(s)
}
#[derive(Default)]
pub struct FauxEnv(pub std::collections::HashMap<&'static str, OsString>);
impl Env for FauxEnv {
fn var_os(&self, key: &str) -> Option<OsString> {
self.0.get(key).map(Into::into)
}
fn has_unused_var(&self, key: &str) -> bool {
self.var_os(key).is_some()
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_get_env_str() {
let env = FauxEnv::default();
assert_eq!(env.get_env_str("FOO"), None);
let env = FauxEnv(vec![("FOO", os("bar"))].into_iter().collect());
assert_eq!(env.get_env_str("FOO"), some("bar"));
}
#[test]
#[cfg(unix)]
fn test_bad_os_str() {
use std::ffi::OsStr;
use std::os::unix::ffi::OsStrExt;
let bad_utf8 = [0x66, 0x6f, 0x80, 0x6f];
let os_str = OsStr::from_bytes(&bad_utf8[..]);
let env = FauxEnv(vec![("BAD", os_str.to_owned())].into_iter().collect());
assert!(env.0.contains_key("BAD"));
assert_eq!(env.get_env_str("BAD"), None);
}
}
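FauxEnv lets env-dependent config code be tested without mutating the process environment. A minimal sketch in the style of the tests above (get_env_str being the same Env helper they rely on):

    let env = FauxEnv(vec![("DATABASE_URL", os("postgresql://localhost/db"))].into_iter().collect());
    assert_eq!(env.get_env_str("DATABASE_URL"), some("postgresql://localhost/db"));
    assert_eq!(env.get_env_str("MISSING"), None);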

View File

@ -1,11 +1,11 @@
use crate::pg::utils::PgError;
use itertools::Itertools;
use log::{error, info, warn};
use serde::{Deserialize, Serialize};
use std::collections::HashMap; use std::collections::HashMap;
use std::io; use std::io;
use std::path::PathBuf; use std::path::PathBuf;
use log::{error, info, warn};
use crate::pg::PgError;
pub type InfoMap<T> = HashMap<String, T>; pub type InfoMap<T> = HashMap<String, T>;
#[derive(thiserror::Error, Debug)] #[derive(thiserror::Error, Debug)]
@ -13,6 +13,9 @@ pub enum Error {
#[error("The --config and the connection parameters cannot be used together")] #[error("The --config and the connection parameters cannot be used together")]
ConfigAndConnectionsError, ConfigAndConnectionsError,
#[error("Unable to bind to {1}: {0}")]
BindingError(io::Error, String),
#[error("Unable to load config file {}: {0}", .1.display())] #[error("Unable to load config file {}: {0}", .1.display())]
ConfigLoadError(io::Error, PathBuf), ConfigLoadError(io::Error, PathBuf),
@ -22,6 +25,12 @@ pub enum Error {
#[error("Unable to write config file {}: {0}", .1.display())] #[error("Unable to write config file {}: {0}", .1.display())]
ConfigWriteError(io::Error, PathBuf), ConfigWriteError(io::Error, PathBuf),
#[error("No tile sources found. Set sources by giving a database connection string on command line, env variable, or a config file.")]
NoSources,
#[error("Unrecognizable connection strings: {0:?}")]
UnrecognizableConnections(Vec<String>),
#[error("{0}")] #[error("{0}")]
PostgresError(#[from] PgError), PostgresError(#[from] PgError),
} }
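A sketch of how a hypothetical top-level handler might distinguish the new variants:

    // `result` is a hypothetical Result<(), Error> from application startup.
    match result {
        Err(Error::BindingError(e, addr)) => eprintln!("Unable to bind to {addr}: {e}"),
        Err(Error::NoSources) => eprintln!("No tile sources found"),
        Err(other) => eprintln!("{other}"),
        Ok(()) => {}
    }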
@ -44,7 +53,7 @@ pub fn find_info<'a, T>(map: &'a InfoMap<T>, key: &'a str, info: &str, id: &str)
} }
#[must_use] #[must_use]
pub fn find_info_kv<'a, T>( fn find_info_kv<'a, T>(
map: &'a InfoMap<T>, map: &'a InfoMap<T>,
key: &'a str, key: &'a str,
info: &str, info: &str,
@ -72,7 +81,7 @@ pub fn find_info_kv<'a, T>(
if multiple.is_empty() { if multiple.is_empty() {
if let Some(result) = result { if let Some(result) = result {
info!("For source {id}, {info} '{key}' was not found, using '{result}' instead."); info!("For source {id}, {info} '{key}' was not found, but found '{result}' instead.");
Some((result.as_str(), map.get(result)?)) Some((result.as_str(), map.get(result)?))
} else { } else {
warn!("Unable to configure source {id} because {info} '{key}' was not found. Possible values are: {}", warn!("Unable to configure source {id} because {info} '{key}' was not found. Possible values are: {}",
@ -80,37 +89,14 @@ pub fn find_info_kv<'a, T>(
None None
} }
} else { } else {
error!("Unable to configure source {id} because {info} '{key}' has no exact match and more than one potential matches: {}", multiple.join(", ")); error!("Unable to configure source {id} because {info} '{key}' has no exact match and more than one potential matches: {}",
multiple.join(", "));
None None
} }
} }
/// A list of schemas to include in the discovery process, or a boolean to #[must_use]
/// indicate whether to run discovery at all. pub fn is_valid_zoom(zoom: i32, minzoom: Option<u8>, maxzoom: Option<u8>) -> bool {
#[derive(Clone, Debug, PartialEq, Serialize, Deserialize)] minzoom.map_or(true, |minzoom| zoom >= minzoom.into())
#[serde(untagged)] && maxzoom.map_or(true, |maxzoom| zoom <= maxzoom.into())
pub enum Schemas {
Bool(bool),
List(Vec<String>),
}
impl Schemas {
/// Returns a list of schemas to include in the discovery process.
/// If self is true, returns a list of all schemas produced by the callback.
pub fn get<'a, I, F>(&self, keys: F) -> Vec<String>
where
I: Iterator<Item = &'a String>,
F: FnOnce() -> I,
{
match self {
Schemas::List(lst) => lst.clone(),
Schemas::Bool(all) => {
if *all {
keys().sorted().map(String::to_string).collect()
} else {
Vec::new()
}
}
}
}
} }
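The relocated is_valid_zoom treats a missing bound as unlimited and both bounds as inclusive; for example:

    assert!(is_valid_zoom(5, None, None)); // no bounds configured
    assert!(is_valid_zoom(22, Some(0), Some(22))); // inclusive on both ends
    assert!(!is_valid_zoom(23, Some(0), Some(22)));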

View File

@ -0,0 +1,18 @@
DROP FUNCTION IF EXISTS public.function_zxy_query_jsonb;
CREATE OR REPLACE FUNCTION public.function_zxy_query_jsonb(z integer, x integer, y integer, query jsonb) RETURNS bytea AS $$
DECLARE
mvt bytea;
BEGIN
RAISE NOTICE 'query: %', query;
SELECT INTO mvt ST_AsMVT(tile, 'public.function_zxy_query_jsonb', 4096, 'geom') FROM (
SELECT
ST_AsMVTGeom(ST_Transform(ST_CurveToLine(geom), 3857), ST_TileEnvelope(z, x, y), 4096, 64, true) AS geom
FROM public.table_source
WHERE geom && ST_Transform(ST_TileEnvelope(z, x, y), 4326)
) as tile WHERE geom IS NOT NULL;
RETURN mvt;
END
$$ LANGUAGE plpgsql IMMUTABLE STRICT PARALLEL SAFE;

View File

@ -2,7 +2,10 @@
set -euo pipefail set -euo pipefail
FIXTURES_DIR="$(dirname "$0")" FIXTURES_DIR="$(dirname "$0")"
echo -e "\n\n\n"
echo "################################################################################################"
echo "Loading Martin test fixtures into '$PGDATABASE' as user '$PGUSER'" echo "Loading Martin test fixtures into '$PGDATABASE' as user '$PGUSER'"
echo "################################################################################################"
psql -P pager=off -v ON_ERROR_STOP=1 -c "CREATE EXTENSION IF NOT EXISTS postgis;" psql -P pager=off -v ON_ERROR_STOP=1 -c "CREATE EXTENSION IF NOT EXISTS postgis;"
@ -12,12 +15,18 @@ psql -P pager=off -v ON_ERROR_STOP=1 -t -c "select version();"
psql -P pager=off -v ON_ERROR_STOP=1 -t -c "select PostGIS_Full_Version();" psql -P pager=off -v ON_ERROR_STOP=1 -t -c "select PostGIS_Full_Version();"
echo -e "\n\n\n"
echo "################################################################################################"
echo "Importing tables from $FIXTURES_DIR/tables" echo "Importing tables from $FIXTURES_DIR/tables"
echo "################################################################################################"
for sql_file in "$FIXTURES_DIR"/tables/*.sql; do for sql_file in "$FIXTURES_DIR"/tables/*.sql; do
psql -e -P pager=off -v ON_ERROR_STOP=1 -f "$sql_file" psql -e -P pager=off -v ON_ERROR_STOP=1 -f "$sql_file"
done done
echo -e "\n\n\n"
echo "################################################################################################"
echo "Importing functions from $FIXTURES_DIR/functions" echo "Importing functions from $FIXTURES_DIR/functions"
echo "################################################################################################"
for sql_file in "$FIXTURES_DIR"/functions/*.sql; do for sql_file in "$FIXTURES_DIR"/functions/*.sql; do
psql -e -P pager=off -v ON_ERROR_STOP=1 -f "$sql_file" psql -e -P pager=off -v ON_ERROR_STOP=1 -f "$sql_file"
done done

View File

@ -1,14 +1,52 @@
DROP SCHEMA IF EXISTS "MixedCase" CASCADE; DROP SCHEMA IF EXISTS "MixedCase" CASCADE;
CREATE SCHEMA "MixedCase"; CREATE SCHEMA "MixedCase";
CREATE TABLE "MixedCase"."MixPoints"("Gid" SERIAL PRIMARY KEY, "TABLE" TEXT, "Geom" GEOMETRY(POINT, 4326)); CREATE TABLE "MixedCase"."MixPoints"
(
"Gid" SERIAL PRIMARY KEY,
"TABLE" TEXT,
"Geom" GEOMETRY(POINT, 4326)
);
-- INSERT INTO "MixedCase"."MixPoints"
-- SELECT generate_series(1, 3) as id,
-- md5(random()::text) as "TABLE",
-- (ST_DUMP(ST_GENERATEPOINTS(st_transform(st_tileenvelope(18, 235085, 122323), 4326), 3))).Geom;
-- INSERT INTO "MixedCase"."MixPoints"
-- SELECT generate_series(4, 30) as id,
-- md5(random()::text) as "TABLE",
-- (ST_DUMP(ST_GENERATEPOINTS(st_transform(st_tileenvelope(0, 0, 0), 4326), 27))).Geom;
INSERT INTO "MixedCase"."MixPoints" INSERT INTO "MixedCase"."MixPoints"
SELECT values (1, '02daedc70702ec68753fde38351f5d9d', '0101000020E610000050C4D38CE9DA61401EFC0EC7C3DA2740'),
generate_series(1, 10000) as id, (2, '7418427ba8a960c3661235f47cc13d46', '0101000020E6100000CC2F4170E9DA6140DEDB02B581DA2740'),
md5(random()::text) as "TABLE", (3, 'd5a11dee7203a09442168eec74c7bea8', '0101000020E6100000008E66E9E6DA614059944356B4DA2740'),
( (4, '2368bbc7ba9dcb274f5465ef10ffad1f', '0101000020E6100000B43E295A4CEE6140265634327FFB52C0'),
ST_DUMP(ST_GENERATEPOINTS(ST_GEOMFROMTEXT('POLYGON ((-180 90, 180 90, 180 -90, -180 -90, -180 90))', 4326), 10000)) (5, '140cf506fdf19e0cd451bc0da0ad8b50', '0101000020E610000016551B51B0B033407C3AE7BBE91B3140'),
).Geom; (6, 'e8d7e0e5b421079203c2f1a84f62d029', '0101000020E61000007CD7F65C2360604055855E6358954F40'),
(7, 'eeea13624e9c7ba34ad7210498061fd9', '0101000020E6100000B5E96FF565874D40328E73C500A951C0'),
(8, '32b066ccc705875a6ba04a4f8fe6ef26', '0101000020E61000002AAF4124655E65C06C3CC08BDE884040'),
(9, '7c304793df1ff378d775106b31a14bea', '0101000020E6100000D0CAD2D7A9790DC000065E0B160843C0'),
(10, 'b936821caa8237e331f26ddf5165784b', '0101000020E6100000CA5016BD8E9563403E9A0372E7932E40'),
(11, '434749fa23d9302d475f7ec190981958', '0101000020E61000004AA2B720B23E45C0E94EBCDB72014740'),
(12, 'fb78b6759036417511bc13e47bc25db8', '0101000020E6100000A35AEF6470684B4006C609806BC74440'),
(13, '730b7f416d91573e5a5d4c32673c716e', '0101000020E61000003BF842670F9B484030FA0AA450DE4D40'),
(14, 'e51f27140b07abdf60b6b0e86271446d', '0101000020E6100000FC54A712989843C0664EB161D4D943C0'),
(15, '1128b472f9ce87958e2b941f732bde55', '0101000020E6100000DBDDCAA1D80B63C0E84F2B8BC8C63DC0'),
(16, 'ff2d28a9b608cb6ef29751c1b7cefc8b', '0101000020E610000082EA2075B2D26440A2B180EAFCEF52C0'),
(17, '6e0d72a4b999f6f993a86af936fde899', '0101000020E610000028E151D6194825C0FD73E0FC5B8615C0'),
(18, '23afce20fa2dd8d8d1f93014447fdba6', '0101000020E6100000B3376FB629D660C017B1393F168F5240'),
(19, '38cb097c70d2ff71e8c8c02855f04166', '0101000020E6100000F1FCE46A01865540EAE8C01038D852C0'),
(20, 'b82d2222d84deecd38a6187a86fd3514', '0101000020E61000005C4A75FF750661C08012B03D84A5EE3F'),
(21, '9efc50c9da5f0da5040c565b2ba838ce', '0101000020E61000008037CA00BD693E4018F8D89279004FC0'),
(22, 'a2dbb89488297ad2c6af9460980479a3', '0101000020E610000092D0FE8AAFF664401EE866F4AF5D3B40'),
(23, '09e3dc819cfd6344bce527be0ef29086', '0101000020E6100000A6235C70F6C053C0C0E86095B8AA0940'),
(24, 'fd59276e15c0577881118df65e3b2b9a', '0101000020E610000078B4CD86D3444240FF879F9C924B4840'),
(25, 'a8a47755660da683c7817634797515e8', '0101000020E6100000B2E72AE85C0143C04487454A6F1F4FC0'),
(26, 'b44bf3139cc2bab31a48b165f63dfaa3', '0101000020E61000008224AB2C6A3364C00C1DD30085CF32C0'),
(27, '48b2e0ae68663d5dc003f20e2cc9dba1', '0101000020E6100000981F49E883D45B405CE9B4808E2637C0'),
(28, '5e27d8b2cbee33e3196aae5e5ec15db2', '0101000020E61000001036BD0CF11F1440600218267D833740'),
(29, 'fd0775c59700ac8c1982aa3efe6cb0c7', '0101000020E6100000D6CF48A3E1A9464077D6BBFDD00C55C0'),
(30, '404175d17b08782edc9d316c378adc86', '0101000020E6100000F9B5A5ADB7265BC0EE07F81F2F284840');
CREATE INDEX ON "MixedCase"."MixPoints" USING GIST ("Geom"); CREATE INDEX ON "MixedCase"."MixPoints" USING GIST ("Geom");

View File

@ -1,16 +1,47 @@
CREATE TABLE points1(gid SERIAL PRIMARY KEY, geom GEOMETRY(POINT, 4326)); CREATE TABLE points1
(
gid SERIAL PRIMARY KEY,
geom GEOMETRY(POINT, 4326)
);
-- INSERT INTO points1
-- SELECT generate_series(1, 3) as id,
-- (ST_DUMP(ST_GENERATEPOINTS(st_transform(st_tileenvelope(18, 235085, 122323), 4326), 3))).geom;
-- INSERT INTO points1
-- SELECT generate_series(4, 30) as id,
-- (ST_DUMP(ST_GENERATEPOINTS(st_transform(st_tileenvelope(0, 0, 0), 4326), 27))).geom;
INSERT INTO points1 INSERT INTO points1
SELECT values (1, '0101000020E6100000EC3A2806EDDA61401C2041E87DDA2740'),
generate_series(1, 10000) as id, (2, '0101000020E61000005DDA9603E9DA614070BB4C49D0DA2740'),
( (3, '0101000020E6100000C975C49BE4DA61405E2616FDD1DA2740'),
ST_DUMP( (4, '0101000020E61000005947D7F5EF045FC0560BE226301A4BC0'),
ST_GENERATEPOINTS( (5, '0101000020E6100000776DF612E1BC65C0CE28B075BB805440'),
ST_GEOMFROMTEXT('POLYGON ((-180 90, 180 90, 180 -90, -180 -90, -180 90))', 4326), (6, '0101000020E6100000D1188AF5BB166340F69C7E0388A14340'),
10000 (7, '0101000020E61000005051CFB7BF4563406F6D5E62B6145340'),
) (8, '0101000020E6100000E101F56A99164940960D11FF91024540'),
) (9, '0101000020E6100000E18D788FBD6866C058FCD51D83923140'),
).geom; (10, '0101000020E6100000E23AE326D47B6140023F70AA32CF4EC0'),
(11, '0101000020E6100000B63649F4E210544024CC8D72539732C0'),
(12, '0101000020E6100000628B27A58F3E3740B0B989B6742D0F40'),
(13, '0101000020E610000010DE41442D603940D0CD3A1C703646C0'),
(14, '0101000020E61000004FC688AD360D4AC01870AA442B7E42C0'),
(15, '0101000020E610000097316B3BD80D5AC004FAD27255E83340'),
(16, '0101000020E610000044A5AD304AD24BC0BD3C7835943B5540'),
(17, '0101000020E61000003A184905AF0A4F4010BF00583A1E5140'),
(18, '0101000020E61000009B30264A61185CC05A2327A3A8EE4BC0'),
(19, '0101000020E6100000EC7FFEA7C6866340BAF66508201A21C0'),
(20, '0101000020E610000026156EA3E9C94E4028CE0241ECC03C40'),
(21, '0101000020E610000041ED7EBCDAF665C0C8B67BDB424FF63F'),
(22, '0101000020E6100000E89B8CD0F3896040D2AABB491A954FC0'),
(23, '0101000020E61000003B7E4B1CC486474060EBF0EDF1863DC0'),
(24, '0101000020E61000009CC12D9B329037406A6264529E143640'),
(25, '0101000020E61000003C6231872D1A3CC0C0F5391D889247C0'),
(26, '0101000020E61000000C4A2739273850C0B42533A49CE150C0'),
(27, '0101000020E610000054990A64657F4DC0E459C5B3933D05C0'),
(28, '0101000020E61000002FE1184680AE64C07D34C584D40049C0'),
(29, '0101000020E61000006046EECC3C536440D410042DE5D04A40'),
(30, '0101000020E61000000FFC00A790165040AA1B2B5EB01A2A40');
CREATE INDEX ON points1 USING GIST (geom); CREATE INDEX ON points1 USING GIST (geom);
CLUSTER points1_geom_idx ON points1; CLUSTER points1_geom_idx ON points1;

View File

@ -1,16 +1,47 @@
CREATE TABLE points2(gid SERIAL PRIMARY KEY, geom GEOMETRY(POINT, 4326)); CREATE TABLE points2
(
gid SERIAL PRIMARY KEY,
geom GEOMETRY(POINT, 4326)
);
-- INSERT INTO points2
-- SELECT generate_series(1, 3) as id,
-- (ST_DUMP(ST_GENERATEPOINTS(st_transform(st_tileenvelope(18, 235085, 122323), 4326), 3))).geom;
-- INSERT INTO points2
-- SELECT generate_series(4, 30) as id,
-- (ST_DUMP(ST_GENERATEPOINTS(st_transform(st_tileenvelope(0, 0, 0), 4326), 27))).geom;
INSERT INTO points2 INSERT INTO points2
SELECT values (1, '0101000020E6100000C8B87C3FE5DA614032D27209ECDA2740'),
generate_series(1, 10000) as id, (2, '0101000020E6100000EF84EC96E8DA6140039B96DD6ADA2740'),
( (3, '0101000020E61000009172473AEADA614003D6D83BF0DA2740'),
ST_DUMP( (4, '0101000020E61000004AB207C657CC59C0583D8A99C35324C0'),
ST_GENERATEPOINTS( (5, '0101000020E61000003EB2523EE2D64CC0065C8FAB1D165340'),
ST_GEOMFROMTEXT('POLYGON ((-180 90, 180 90, 180 -90, -180 -90, -180 90))', 4326), (6, '0101000020E6100000884A93598E1E634095CDEC2CC2924340'),
10000 (7, '0101000020E61000005072EC2CB50954C053DDA0E4F3A24DC0'),
) (8, '0101000020E610000002C8849A96EA61C0C2F360AA8D1D54C0'),
) (9, '0101000020E61000002804EB9F63DD5B4038F144E527B434C0'),
).geom; (10, '0101000020E610000000AC1A91908E4840E20DEDEC319337C0'),
(11, '0101000020E61000001B98DDBF963232C0388FFD8CE1762AC0'),
(12, '0101000020E6100000921693D3F7555EC0705ADDACECEE3240'),
(13, '0101000020E61000001845FE176B031640A4CF0AEB2CA605C0'),
(14, '0101000020E61000001B402E3D15B54540985AAE40CBA4FEBF'),
(15, '0101000020E610000070A58239111952C0787B42BAB4E723C0'),
(16, '0101000020E6100000A061B652FF6CFB3F33BAB22F5D485440'),
(17, '0101000020E610000032080D36EBDE63408AED619E22522F40'),
(18, '0101000020E61000000DF5DFCD3A4B4DC07AA218AA798350C0'),
(19, '0101000020E6100000EED1B438549962C092B2ECDF100041C0'),
(20, '0101000020E6100000962031749463664068AB6A74DEDA52C0'),
(21, '0101000020E6100000C32AD2DAB5755540FCE27D02A0C134C0'),
(22, '0101000020E6100000DA915E30698E46400EA1EDE7CD5E5040'),
(23, '0101000020E610000076DE962776282EC021C892CCA67549C0'),
(24, '0101000020E6100000169401C09FC165C0FF58934CFCE225C0'),
(25, '0101000020E610000032EC6B07BBD83A40025879684C523C40'),
(26, '0101000020E6100000E9DC67BC5E935840951C8BB9074928C0'),
(27, '0101000020E6100000B73BE9BE2C5358C05669236D996722C0'),
(28, '0101000020E6100000903882FD245064403F052FF70C8A4F40'),
(29, '0101000020E6100000CA5D434B9C8F53C002D9B561D3124E40'),
(30, '0101000020E61000001076C2DDC6956540B0E88CCB964D2A40');
CREATE INDEX ON points2 USING GIST (geom); CREATE INDEX ON points2 USING GIST (geom);
CLUSTER points2_geom_idx ON points2; CLUSTER points2_geom_idx ON points2;

View File

@ -1,19 +1,47 @@
CREATE TABLE points3857(gid SERIAL PRIMARY KEY, geom GEOMETRY(POINT, 3857)); CREATE TABLE points3857
(
gid SERIAL PRIMARY KEY,
geom GEOMETRY(POINT, 3857)
);
-- INSERT INTO points3857
-- SELECT generate_series(1, 3) as id,
-- (ST_DUMP(ST_GENERATEPOINTS(st_tileenvelope(18, 235085, 122323), 3))).geom;
-- INSERT INTO points3857
-- SELECT generate_series(4, 30) as id,
-- (ST_DUMP(ST_GENERATEPOINTS(st_tileenvelope(0, 0, 0), 27))).geom;
INSERT INTO points3857 INSERT INTO points3857
SELECT values (1, '0101000020110F0000208AFF3226546E41C84F65383E683441'),
generate_series(1, 10000) as id, (2, '0101000020110F00002C8A109F1B546E41A2E6C2B64E683441'),
( (3, '0101000020110F0000EE7AA3D31B546E41B166FE6638683441'),
ST_DUMP( (4, '0101000020110F00009A480E9917B96C41922C7B04914562C1'),
ST_GENERATEPOINTS( (5, '0101000020110F000078FF726538CB5E41152FFC1AF27E57C1'),
ST_TRANSFORM( (6, '0101000020110F0000304C78D340B770C119ADCD8AD6445441'),
ST_GEOMFROMTEXT('POLYGON ((-179 89, 179 89, 179 -89, -179 -89, -179 89))', 4326), (7, '0101000020110F00004C86889FCB9B3A41F0AB6D9D635B5FC1'),
3857 (8, '0101000020110F0000FE84FE2F1F9F6CC1DC325036FF286BC1'),
), (9, '0101000020110F0000EC8E35B4ECDC68C11CBB8EC86D923241'),
10000 (10, '0101000020110F00009B46FD7EFFD15A411418509C299D4741'),
) (11, '0101000020110F0000A20E4CBA1469484119086B669D466741'),
) (12, '0101000020110F00004A55251576586C41AC274E3A89EB5741'),
).geom; (13, '0101000020110F00008B8B5442A8CC58C1E0100928BF9753C1'),
(14, '0101000020110F00007C3FF5E2F6AD65C1C09A48A26FFEE7C0'),
(15, '0101000020110F00001E5DCA1281AB674130CB88B4D37C0FC1'),
(16, '0101000020110F0000402E3515AC216AC168F17ABCA9286941'),
(17, '0101000020110F0000A619450A8F5072414F62F95DB65F7141'),
(18, '0101000020110F00006EBF8AE0243B5AC165CFBBF145A26FC1'),
(19, '0101000020110F000054D8A8407ECA5DC11470E8DAE9696141'),
(20, '0101000020110F0000D367655933744DC1EEFBC3B2B7276BC1'),
(21, '0101000020110F0000B65DE9B69E454041ECAC82B2B0AC3441'),
(22, '0101000020110F0000046B677462BA714136A3D1753D1667C1'),
(23, '0101000020110F0000DE4E3D79E50158C14DC7142F5F307241'),
(24, '0101000020110F0000284EAADA8EF03041468A8FB7E7CC6541'),
(25, '0101000020110F00008276EA59054F6241727B468F5BE26E41'),
(26, '0101000020110F0000A1FF6E77A02271C169E29727FD3351C1'),
(27, '0101000020110F00003F5D8F7E2BB05441224CC4A8D1A96541'),
(28, '0101000020110F0000E479ACB3ABD05041D886ECFAF5CF6BC1'),
(29, '0101000020110F00001386BD74F42E724112A10AF19ADA60C1'),
(30, '0101000020110F00009E4B1FD4C345574120DFFEC70B0A51C1');
CREATE INDEX ON points3857 USING GIST (geom); CREATE INDEX ON points3857 USING GIST (geom);
CLUSTER points3857_geom_idx ON points3857; CLUSTER points3857_geom_idx ON points3857;

View File

@ -1,19 +1,47 @@
CREATE TABLE points_empty_srid(gid SERIAL PRIMARY KEY, geom GEOMETRY); CREATE TABLE points_empty_srid
(
gid SERIAL PRIMARY KEY,
geom GEOMETRY
);
-- INSERT INTO points_empty_srid
-- SELECT generate_series(1, 3) as id,
-- (ST_DUMP(ST_GENERATEPOINTS(ST_TRANSFORM(st_tileenvelope(18, 235085, 122323), 900913), 3))).geom;
-- INSERT INTO points_empty_srid
-- SELECT generate_series(4, 30) as id,
-- (ST_DUMP(ST_GENERATEPOINTS(ST_TRANSFORM(st_tileenvelope(0, 0, 0), 900913), 27))).geom;
INSERT INTO points_empty_srid INSERT INTO points_empty_srid
SELECT values (1, '010100002031BF0D00A893BD242C546E4114573D7189453441'),
generate_series(1, 10000) as id, (2, '010100002031BF0D008F47208524546E419F1F5DF118463441'),
( (3, '010100002031BF0D00122679C128546E41DF077C57EB453441'),
ST_DUMP( (4, '010100002031BF0D00502800BEFF687041B7FA65EF000E71C1'),
ST_GENERATEPOINTS( (5, '010100002031BF0D00283DA444BE53354176887245EB6066C1'),
ST_TRANSFORM( (6, '010100002031BF0D00679A07B56C4B504168216B067DD65DC1'),
ST_GEOMFROMTEXT('POLYGON ((-179 89, 179 89, 179 -89, -179 -89, -179 89))', 4326), (7, '010100002031BF0D00B6D9B8B1B70A68C11D3E2837B86C72C1'),
900913 (8, '010100002031BF0D00BEAFAF46A56B5241523D071D05D96241'),
), (9, '010100002031BF0D0084239A093C4A70417CC6E3A2C8C53B41'),
10000 (10, '010100002031BF0D004475363F9C6B61C1ACAEC94206C950C1'),
) (11, '010100002031BF0D00B61E5FA4563C71C1166F2110C18E6241'),
) (12, '010100002031BF0D0036BADF2EB3EB56C1F8E5F8E651E971C1'),
).geom; (13, '010100002031BF0D00C48A851B07CE69C1639F032C33EC64C1'),
(14, '010100002031BF0D00228AE2D877F272417307AA3AEF8757C1'),
(15, '010100002031BF0D004149C981DB206B4173A89BBB098E6841'),
(16, '010100002031BF0D00B4DB37CFDA3149C122DBF542798E6B41'),
(17, '010100002031BF0D00DE8CB588496A50410F332B90ECED68C1'),
(18, '010100002031BF0D00CEAA1162393E59416AD4838434637041'),
(19, '010100002031BF0D00C0700EC5080A3141401CDE1EAA703F41'),
(20, '010100002031BF0D00542CA763BFE33BC19D52D6EA59BB6441'),
(21, '010100002031BF0D007FB0FCE2289B5F41F6BB98F8F00B4641'),
(22, '010100002031BF0D0051FF66E42ADD57C1BF7B765208154AC1'),
(23, '010100002031BF0D001C164D29EA2659C190BE2AA8514D6841'),
(24, '010100002031BF0D00541CC23D08883841960C8F0EBF4E6BC1'),
(25, '010100002031BF0D00409FC9B8D50867C10058FAE36ED01941'),
(26, '010100002031BF0D006375CA2B561E6741C3DBA6C58AB64F41'),
(27, '010100002031BF0D00E4F260A533D250C1A66FD71A76956041'),
(28, '010100002031BF0D00005C10F52F3271C1D1D701BD32B37041'),
(29, '010100002031BF0D00488317ADC4177041E539DBF991A270C1'),
(30, '010100002031BF0D00E2A0DFFAED4440C15A88E23068CF5EC1');
CREATE INDEX ON points_empty_srid USING GIST (geom); CREATE INDEX ON points_empty_srid USING GIST (geom);
CLUSTER points_empty_srid_geom_idx ON points_empty_srid; CLUSTER points_empty_srid_geom_idx ON points_empty_srid;

View File

@ -21,10 +21,13 @@ INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;COMPOUNDCURVE(CIR
INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;CURVEPOLYGON(CIRCULARSTRING(-2 0,-1 -1,0 0,1 -1,2 0,0 2,-2 0),(-1 0,0 0.5,1 0,0 1,-1 0))')); INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;CURVEPOLYGON(CIRCULARSTRING(-2 0,-1 -1,0 0,1 -1,2 0,0 2,-2 0),(-1 0,0 0.5,1 0,0 1,-1 0))'));
INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;MULTICURVE((5 5,3 5,3 3,0 3),CIRCULARSTRING(0 0,2 1,2 2))')); INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;MULTICURVE((5 5,3 5,3 3,0 3),CIRCULARSTRING(0 0,2 1,2 2))'));
-- Moscow INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(142.84124343269863 11.927545216212339)'));
INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(37.617222 55.755833)')); INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(142.84022627741408 11.926919775099435)'));
INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(37.599983 55.720154)')); INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(142.84116724279622 11.926986082398354)'));
INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(37.629691 55.732225)')); INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(142.84129834730146 11.926483025982757)'));
INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(37.652966 55.764475)')); INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(142.84086326293937 11.92741281580712)'));
INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(37.634416 55.758747)')); INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(142.84083973422645 11.927188724740008)'));
INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(37.633562 55.763012)')); INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(142.8407405154705 11.92659842381238)'));
INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(142.84029057105903 11.92711170365923)'));
INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(142.8403402985401 11.927568375227375)'));
INSERT INTO table_source(geom) values (GeomFromEWKT('SRID=4326;POINT(142.84131509869133 11.92781306544329)'));

View File

@ -1,14 +1,50 @@
CREATE TABLE table_source_multiple_geom ( CREATE TABLE table_source_multiple_geom
(
gid serial PRIMARY KEY, gid serial PRIMARY KEY,
geom1 GEOMETRY(point, 4326), geom1 GEOMETRY(point, 4326),
geom2 GEOMETRY(point, 4326) geom2 GEOMETRY(point, 4326)
); );
-- INSERT INTO table_source_multiple_geom
-- SELECT generate_series(1, 3) AS id,
-- (ST_DUMP(ST_GENERATEPOINTS(st_transform(st_tileenvelope(18, 235085, 122323), 4326), 3))).geom,
-- (ST_DUMP(ST_GENERATEPOINTS(st_transform(st_tileenvelope(18, 235085, 122323), 4326), 3))).geom;
-- INSERT INTO table_source_multiple_geom
-- SELECT generate_series(4, 30) AS id,
-- (ST_DUMP(ST_GENERATEPOINTS(st_transform(st_tileenvelope(0, 0, 0), 4326), 27))).geom,
-- (ST_DUMP(ST_GENERATEPOINTS(st_transform(st_tileenvelope(0, 0, 0), 4326), 27))).geom;
INSERT INTO table_source_multiple_geom INSERT INTO table_source_multiple_geom
SELECT values (1, '0101000020E61000006F93CE44E2DA61405A7E3A8EE2DA2740', '0101000020E61000006F93CE44E2DA61405A7E3A8EE2DA2740'),
generate_series(1, 10000) AS id, (2, '0101000020E61000002A2F4384ECDA61404D19BE2EF1DA2740', '0101000020E61000002A2F4384ECDA61404D19BE2EF1DA2740'),
(ST_DUMP (ST_GENERATEPOINTS (ST_GEOMFROMTEXT ('POLYGON ((-180 90, 180 90, 180 -90, -180 -90, -180 90))', 4326), 10000))).geom, (3, '0101000020E610000011745334EADA614006EBE436B5DA2740', '0101000020E610000011745334EADA614006EBE436B5DA2740'),
(ST_DUMP (ST_GENERATEPOINTS (ST_GEOMFROMTEXT ('POLYGON ((-180 90, 180 90, 180 -90, -180 -90, -180 90))', 4326), 10000))).geom; (4, '0101000020E610000082B2053A76EA3DC0CF905E21719553C0', '0101000020E610000082B2053A76EA3DC0CF905E21719553C0'),
(5, '0101000020E610000095A67DF1129946C042A23F4C63F25240', '0101000020E610000095A67DF1129946C042A23F4C63F25240'),
(6, '0101000020E6100000DAEE328840DA52C0AEDD2E46A9663040', '0101000020E6100000DAEE328840DA52C0AEDD2E46A9663040'),
(7, '0101000020E6100000FE870BA3A5D96040BE6ACC00CF383E40', '0101000020E6100000FE870BA3A5D96040BE6ACC00CF383E40'),
(8, '0101000020E6100000257A079F662F4AC04A7A9A879DE351C0', '0101000020E6100000257A079F662F4AC04A7A9A879DE351C0'),
(9, '0101000020E6100000426FE996B38C32C0A163CDE8A57C28C0', '0101000020E6100000426FE996B38C32C0A163CDE8A57C28C0'),
(10, '0101000020E6100000B8BC2706F2565CC028385605BD6D4040', '0101000020E6100000B8BC2706F2565CC028385605BD6D4040'),
(11, '0101000020E6100000C2CEB98764AE4D406B21F6B41AE652C0', '0101000020E6100000C2CEB98764AE4D406B21F6B41AE652C0'),
(12, '0101000020E6100000F0301E45DD1361C0CE9C21592FBF48C0', '0101000020E6100000F0301E45DD1361C0CE9C21592FBF48C0'),
(13, '0101000020E6100000E133F06190C26240A6A97C4CDFDB44C0', '0101000020E6100000E133F06190C26240A6A97C4CDFDB44C0'),
(14, '0101000020E610000006B54CC037285040708D4853BC86FA3F', '0101000020E610000006B54CC037285040708D4853BC86FA3F'),
(15, '0101000020E6100000D292D622E60920C0C09775BA2EC130C0', '0101000020E6100000D292D622E60920C0C09775BA2EC130C0'),
(16, '0101000020E6100000ACDEC2CDCA7C53C01AFECDA6747C4F40', '0101000020E6100000ACDEC2CDCA7C53C01AFECDA6747C4F40'),
(17, '0101000020E61000006AE4BCA506B051C004C6DE0FDBC52740', '0101000020E61000006AE4BCA506B051C004C6DE0FDBC52740'),
(18, '0101000020E6100000E699E9B1D8F161406112C175D8414040', '0101000020E6100000E699E9B1D8F161406112C175D8414040'),
(19, '0101000020E61000007E6DADFABC9857C0D0812ACB1E5747C0', '0101000020E61000007E6DADFABC9857C0D0812ACB1E5747C0'),
(20, '0101000020E6100000465DFCD8EF6321C02E7E6BEA98604B40', '0101000020E6100000465DFCD8EF6321C02E7E6BEA98604B40'),
(21, '0101000020E61000007155AF370E0E5EC068A4E696CCE92F40', '0101000020E61000007155AF370E0E5EC068A4E696CCE92F40'),
(22, '0101000020E6100000C487FE4DA8DC5540E86D4FB3ECBD3EC0', '0101000020E6100000C487FE4DA8DC5540E86D4FB3ECBD3EC0'),
(23, '0101000020E61000008E1778C16B485BC0D40CECB9907339C0', '0101000020E61000008E1778C16B485BC0D40CECB9907339C0'),
(24, '0101000020E61000003CCEB2F79CE76140EB9D6088B2C043C0', '0101000020E61000003CCEB2F79CE76140EB9D6088B2C043C0'),
(25, '0101000020E610000068626BF0544155407C0037D493155140', '0101000020E610000068626BF0544155407C0037D493155140'),
(26, '0101000020E61000007D7E723479BB5440B77145DA30103040', '0101000020E61000007D7E723479BB5440B77145DA30103040'),
(27, '0101000020E610000004A043CFC95930C0906F9082408F3EC0', '0101000020E610000004A043CFC95930C0906F9082408F3EC0'),
(28, '0101000020E61000003978F2A954A055C0746C74D1F6672940', '0101000020E61000003978F2A954A055C0746C74D1F6672940'),
(29, '0101000020E6100000807E70F20043EA3FC8F520E2B0425140', '0101000020E6100000807E70F20043EA3FC8F520E2B0425140'),
(30, '0101000020E610000099F640E9031266405CC296F14F363440', '0101000020E610000099F640E9031266405CC296F14F363440');
CREATE INDEX ON table_source_multiple_geom USING GIST (geom1); CREATE INDEX ON table_source_multiple_geom USING GIST (geom1);
CREATE INDEX ON table_source_multiple_geom USING GIST (geom2); CREATE INDEX ON table_source_multiple_geom USING GIST (geom2);

View File

@ -1,9 +1,8 @@
use ctor::ctor; use ctor::ctor;
use itertools::Itertools; use itertools::Itertools;
use log::info; use log::info;
use martin::pg::function_source::get_function_sources; use martin::pg::{get_function_sources, Schemas};
use martin::source::Xyz; use martin::Xyz;
use martin::utils::Schemas;
#[path = "utils.rs"] #[path = "utils.rs"]
mod utils; mod utils;
@ -31,6 +30,12 @@ async fn get_function_sources_ok() {
assert_eq!(source.1.minzoom, None); assert_eq!(source.1.minzoom, None);
assert_eq!(source.1.maxzoom, None); assert_eq!(source.1.maxzoom, None);
assert_eq!(source.1.bounds, None); assert_eq!(source.1.bounds, None);
let source = funcs
.get("function_zxy_query_jsonb")
.expect("function_zxy_query_jsonb not found");
assert_eq!(source.1.schema, "public");
assert_eq!(source.1.function, "function_zxy_query_jsonb");
} }
#[actix_rt::test] #[actix_rt::test]
@ -41,9 +46,9 @@ async fn function_source_tilejson() {
info!("tilejson = {tilejson:#?}"); info!("tilejson = {tilejson:#?}");
assert_eq!(tilejson.tilejson, "2.2.0"); assert_eq!(tilejson.tilejson, "2.2.0");
assert_eq!(tilejson.version, some_str("1.0.0")); assert_eq!(tilejson.version, some("1.0.0"));
assert_eq!(tilejson.name, some_str("public.function_zxy_query")); assert_eq!(tilejson.name, some("public.function_zxy_query"));
assert_eq!(tilejson.scheme, some_str("xyz")); assert_eq!(tilejson.scheme, some("xyz"));
assert_eq!(tilejson.minzoom, Some(0)); assert_eq!(tilejson.minzoom, Some(0));
assert_eq!(tilejson.maxzoom, Some(30)); assert_eq!(tilejson.maxzoom, Some(30));
assert!(tilejson.bounds.is_some()); assert!(tilejson.bounds.is_some());
@ -58,7 +63,13 @@ async fn function_source_tile() {
.get_tile(&Xyz { z: 0, x: 0, y: 0 }, &None) .get_tile(&Xyz { z: 0, x: 0, y: 0 }, &None)
.await .await
.unwrap(); .unwrap();
assert!(!tile.is_empty());
let src = source(&mock, "function_zxy_query_jsonb");
let tile = src
.get_tile(&Xyz { z: 0, x: 0, y: 0 }, &None)
.await
.unwrap();
assert!(!tile.is_empty()); assert!(!tile.is_empty());
} }

View File

@ -2,9 +2,8 @@ use actix_http::Request;
use actix_web::http::StatusCode; use actix_web::http::StatusCode;
use actix_web::test::{call_and_read_body_json, call_service, read_body, TestRequest}; use actix_web::test::{call_and_read_body_json, call_service, read_body, TestRequest};
use ctor::ctor; use ctor::ctor;
use martin::pg::config_function::FunctionInfo; use martin::pg::{FunctionInfo, TableInfo};
use martin::pg::config_table::TableInfo; use martin::srv::IndexEntry;
use martin::srv::server::IndexEntry;
use tilejson::{Bounds, TileJSON}; use tilejson::{Bounds, TileJSON};
#[path = "utils.rs"] #[path = "utils.rs"]
@ -24,7 +23,7 @@ macro_rules! create_app {
::actix_web::test::init_service( ::actix_web::test::init_service(
::actix_web::App::new() ::actix_web::App::new()
.app_data(state) .app_data(state)
.configure(::martin::srv::server::router), .configure(::martin::srv::router),
) )
.await .await
}}; }};
@ -49,6 +48,9 @@ async fn get_catalog_ok() {
let expected = "function_zxy_query"; let expected = "function_zxy_query";
assert_eq!(sources.iter().filter(|v| v.id == expected).count(), 1); assert_eq!(sources.iter().filter(|v| v.id == expected).count(), 1);
let expected = "function_zxy_query_jsonb";
assert_eq!(sources.iter().filter(|v| v.id == expected).count(), 1);
} }
#[actix_rt::test] #[actix_rt::test]
@ -229,6 +231,9 @@ async fn get_function_tiles() {
let req = test_get("/function_zxy_query/6/38/20"); let req = test_get("/function_zxy_query/6/38/20");
assert!(call_service(&app, req).await.status().is_success()); assert!(call_service(&app, req).await.status().is_success());
let req = test_get("/function_zxy_query_jsonb/6/38/20");
assert!(call_service(&app, req).await.status().is_success());
let req = test_get("/function_zxy_row/6/38/20"); let req = test_get("/function_zxy_row/6/38/20");
assert!(call_service(&app, req).await.status().is_success()); assert!(call_service(&app, req).await.status().is_success());
@ -355,6 +360,10 @@ async fn get_function_source_ok() {
let response = call_service(&app, req).await; let response = call_service(&app, req).await;
assert!(response.status().is_success()); assert!(response.status().is_success());
let req = test_get("/function_zxy_query_jsonb");
let response = call_service(&app, req).await;
assert!(response.status().is_success());
let req = test_get("/function_zxy_query_test"); let req = test_get("/function_zxy_query_test");
let response = call_service(&app, req).await; let response = call_service(&app, req).await;
assert!(response.status().is_success()); assert!(response.status().is_success());
@ -380,6 +389,19 @@ async fn get_function_source_ok() {
result.tiles, result.tiles,
&["http://localhost:8080/tiles/function_zxy_query/{z}/{x}/{y}?token=martin"] &["http://localhost:8080/tiles/function_zxy_query/{z}/{x}/{y}?token=martin"]
); );
let req = TestRequest::get()
.uri("/function_zxy_query_jsonb?token=martin")
.insert_header((
"x-rewrite-url",
"/tiles/function_zxy_query_jsonb?token=martin",
))
.to_request();
let result: TileJSON = call_and_read_body_json(&app, req).await;
assert_eq!(
result.tiles,
&["http://localhost:8080/tiles/function_zxy_query_jsonb/{z}/{x}/{y}?token=martin"]
);
} }
#[actix_rt::test] #[actix_rt::test]
@ -484,12 +506,12 @@ async fn tables_feature_id() {
..default.clone() ..default.clone()
}; };
let id_only = TableInfo { let id_only = TableInfo {
id_column: some_str("giD"), id_column: some("giD"),
properties: props(&[("TABLE", "text")]), properties: props(&[("TABLE", "text")]),
..default.clone() ..default.clone()
}; };
let id_and_prop = TableInfo { let id_and_prop = TableInfo {
id_column: some_str("giD"), id_column: some("giD"),
properties: props(&[("giD", "int4"), ("TABLE", "text")]), properties: props(&[("giD", "int4"), ("TABLE", "text")]),
..default.clone() ..default.clone()
}; };
@ -517,11 +539,11 @@ async fn tables_feature_id() {
// }); // });
let src = table(&mock, "id_only"); let src = table(&mock, "id_only");
assert_eq!(src.id_column, some_str("giD")); assert_eq!(src.id_column, some("giD"));
assert_eq!(src.properties.len(), 1); assert_eq!(src.properties.len(), 1);
let src = table(&mock, "id_and_prop"); let src = table(&mock, "id_and_prop");
assert_eq!(src.id_column, some_str("giD")); assert_eq!(src.id_column, some("giD"));
assert_eq!(src.properties.len(), 2); assert_eq!(src.properties.len(), 2);
let src = table(&mock, "prop_only"); let src = table(&mock, "prop_only");

View File

@ -1,9 +1,10 @@
use std::collections::HashMap;
use ctor::ctor; use ctor::ctor;
use itertools::Itertools; use itertools::Itertools;
use log::info; use log::info;
use martin::source::Xyz; use martin::pg::Schemas;
use martin::utils::Schemas; use martin::Xyz;
use std::collections::HashMap;
#[path = "utils.rs"] #[path = "utils.rs"]
mod utils; mod utils;
@ -32,7 +33,7 @@ async fn table_source() {
assert_eq!(source.extent, Some(4096)); assert_eq!(source.extent, Some(4096));
assert_eq!(source.buffer, Some(64)); assert_eq!(source.buffer, Some(64));
assert_eq!(source.clip_geom, Some(true)); assert_eq!(source.clip_geom, Some(true));
assert_eq!(source.geometry_type, some_str("GEOMETRY")); assert_eq!(source.geometry_type, some("GEOMETRY"));
let mut properties = HashMap::new(); let mut properties = HashMap::new();
properties.insert("gid".to_owned(), "int4".to_owned()); properties.insert("gid".to_owned(), "int4".to_owned());
@ -47,9 +48,9 @@ async fn tables_tilejson_ok() {
info!("tilejson = {tilejson:#?}"); info!("tilejson = {tilejson:#?}");
assert_eq!(tilejson.tilejson, "2.2.0"); assert_eq!(tilejson.tilejson, "2.2.0");
assert_eq!(tilejson.version, some_str("1.0.0")); assert_eq!(tilejson.version, some("1.0.0"));
assert_eq!(tilejson.name, some_str("public.table_source.geom")); assert_eq!(tilejson.name, some("public.table_source.geom"));
assert_eq!(tilejson.scheme, some_str("xyz")); assert_eq!(tilejson.scheme, some("xyz"));
assert_eq!(tilejson.minzoom, Some(0)); assert_eq!(tilejson.minzoom, Some(0));
assert_eq!(tilejson.maxzoom, Some(30)); assert_eq!(tilejson.maxzoom, Some(30));
assert!(tilejson.bounds.is_some()); assert!(tilejson.bounds.is_some());

View File

@ -13,15 +13,15 @@ function wait_for_martin {
# It seems the --retry-all-errors option is not available on older curl versions, but maybe in the future we can just use this: # It seems the --retry-all-errors option is not available on older curl versions, but maybe in the future we can just use this:
# timeout -k 20s 20s curl --retry 10 --retry-all-errors --retry-delay 1 -sS "$MARTIN_URL/health" # timeout -k 20s 20s curl --retry 10 --retry-all-errors --retry-delay 1 -sS "$MARTIN_URL/health"
PROCESS_ID=$1 PROCESS_ID=$1
echo "Waiting for Martin ($PROCESS_ID) to start..." echo "Waiting for Martin ($PROCESS_ID) to start by checking $MARTIN_URL/health to be valid..."
for i in {1..30}; do for i in {1..60}; do
if curl -sSf "$MARTIN_URL/health" 2>/dev/null >/dev/null; then if curl -sSf "$MARTIN_URL/health" 2>/dev/null >/dev/null; then
echo "Martin is up!" echo "Martin is up!"
curl -s "$MARTIN_URL/health" curl -s "$MARTIN_URL/health"
return return
fi fi
if ps -p $PROCESS_ID > /dev/null ; then if ps -p $PROCESS_ID > /dev/null ; then
echo "Martin is not up yet, waiting..." echo "Martin is not up yet, waiting for $MARTIN_URL/health ..."
sleep 1 sleep 1
else else
echo "Martin died!" echo "Martin died!"
@ -79,9 +79,12 @@ fi
echo "------------------------------------------------------------------------------------------------------------------------" echo "------------------------------------------------------------------------------------------------------------------------"
echo "Test auto configured Martin" echo "Test auto configured Martin"
set -x
ARG=(--default-srid 900913) TEST_OUT_DIR="$(dirname "$0")/output/auto"
mkdir -p "$TEST_OUT_DIR"
ARG=(--default-srid 900913 --save-config "$(dirname "$0")/output/generated_config.yaml")
set -x
$MARTIN_BIN "${ARG[@]}" 2>&1 | tee test_log_1.txt & $MARTIN_BIN "${ARG[@]}" 2>&1 | tee test_log_1.txt &
PROCESS_ID=`jobs -p` PROCESS_ID=`jobs -p`
@ -89,53 +92,51 @@ PROCESS_ID=`jobs -p`
trap "kill -9 $PROCESS_ID 2> /dev/null || true" EXIT trap "kill -9 $PROCESS_ID 2> /dev/null || true" EXIT
wait_for_martin $PROCESS_ID wait_for_martin $PROCESS_ID
TEST_OUT_DIR="$(dirname "$0")/output/auto"
mkdir -p "$TEST_OUT_DIR"
>&2 echo "Test catalog" >&2 echo "Test catalog"
$CURL "$MARTIN_URL/catalog" | jq --sort-keys -e | tee "$TEST_OUT_DIR/catalog.json" $CURL "$MARTIN_URL/catalog" | jq --sort-keys -e | tee "$TEST_OUT_DIR/catalog_auto.json"
>&2 echo "Test server response for table source" >&2 echo "***** Test server response for table source *****"
test_pbf tbl_0_0_0 table_source/0/0/0 test_pbf tbl_0_0_0 table_source/0/0/0
test_pbf tbl_6_38_20 table_source/6/38/20 test_pbf tbl_6_57_29 table_source/6/57/29
test_pbf tbl_12_2476_1280 table_source/12/2476/1280 test_pbf tbl_12_3673_1911 table_source/12/3673/1911
test_pbf tbl_13_4952_2560 table_source/13/4952/2560 test_pbf tbl_13_7346_3822 table_source/13/7346/3822
test_pbf tbl_14_9904_5121 table_source/14/9904/5121 test_pbf tbl_14_14692_7645 table_source/14/14692/7645
test_pbf tbl_20_633856_327787 table_source/20/633856/327787 test_pbf tbl_17_117542_61161 table_source/17/117542/61161
test_pbf tbl_21_1267712_655574 table_source/21/1267712/655574 test_pbf tbl_18_235085_122323 table_source/18/235085/122323
>&2 echo "Test server response for composite source" >&2 echo "***** Test server response for composite source *****"
test_pbf cmp_0_0_0 table_source,points1,points2/0/0/0 test_pbf cmp_0_0_0 table_source,points1,points2/0/0/0
test_pbf cmp_6_38_20 table_source,points1,points2/6/38/20 test_pbf cmp_6_57_29 table_source,points1,points2/6/57/29
test_pbf cmp_12_2476_1280 table_source,points1,points2/12/2476/1280 test_pbf cmp_12_3673_1911 table_source,points1,points2/12/3673/1911
test_pbf cmp_13_4952_2560 table_source,points1,points2/13/4952/2560 test_pbf cmp_13_7346_3822 table_source,points1,points2/13/7346/3822
test_pbf cmp_14_9904_5121 table_source,points1,points2/14/9904/5121 test_pbf cmp_14_14692_7645 table_source,points1,points2/14/14692/7645
test_pbf cmp_20_633856_327787 table_source,points1,points2/20/633856/327787 test_pbf cmp_17_117542_61161 table_source,points1,points2/17/117542/61161
test_pbf cmp_21_1267712_655574 table_source,points1,points2/21/1267712/655574 test_pbf cmp_18_235085_122323 table_source,points1,points2/18/235085/122323
>&2 echo "Test server response for function source" >&2 echo "***** Test server response for function source *****"
test_pbf fnc_0_0_0 function_zxy_query/0/0/0 test_pbf fnc_0_0_0 function_zxy_query/0/0/0
test_pbf fnc_6_38_20 function_zxy_query/6/38/20 test_pbf fnc_6_57_29 function_zxy_query/6/57/29
test_pbf fnc_12_2476_1280 function_zxy_query/12/2476/1280 test_pbf fnc_12_3673_1911 function_zxy_query/12/3673/1911
test_pbf fnc_13_4952_2560 function_zxy_query/13/4952/2560 test_pbf fnc_13_7346_3822 function_zxy_query/13/7346/3822
test_pbf fnc_14_9904_5121 function_zxy_query/14/9904/5121 test_pbf fnc_14_14692_7645 function_zxy_query/14/14692/7645
test_pbf fnc_20_633856_327787 function_zxy_query/20/633856/327787 test_pbf fnc_17_117542_61161 function_zxy_query/17/117542/61161
test_pbf fnc_21_1267712_655574 function_zxy_query/21/1267712/655574 test_pbf fnc_18_235085_122323 function_zxy_query/18/235085/122323
test_pbf fnc_0_0_0_token function_zxy_query_test/0/0/0?token=martin test_pbf fnc_0_0_0_token function_zxy_query_test/0/0/0?token=martin
test_pbf fnc_b_6_38_20 function_zxy_query_jsonb/6/57/29
>&2 echo "Test server response for different function call types" >&2 echo "***** Test server response for different function call types *****"
test_pbf fnc_zoom_xy_6_38_20 function_zoom_xy/6/38/20 test_pbf fnc_zoom_xy_6_57_29 function_zoom_xy/6/57/29
test_pbf fnc_zxy_6_38_20 function_zxy/6/38/20 test_pbf fnc_zxy_6_57_29 function_zxy/6/57/29
test_pbf fnc_zxy2_6_38_20 function_zxy2/6/38/20 test_pbf fnc_zxy2_6_57_29 function_zxy2/6/57/29
test_pbf fnc_zxy_query_6_38_20 function_zxy_query/6/38/20 test_pbf fnc_zxy_query_6_57_29 function_zxy_query/6/57/29
test_pbf fnc_zxy_row_6_38_20 function_zxy_row/6/38/20 test_pbf fnc_zxy_row_6_57_29 function_zxy_row/6/57/29
test_pbf fnc_zxy_row2_6_38_20 function_Mixed_Name/6/38/20 test_pbf fnc_zxy_row2_6_57_29 function_Mixed_Name/6/57/29
test_pbf fnc_zxy_row_key_6_38_20 function_zxy_row_key/6/38/20 test_pbf fnc_zxy_row_key_6_57_29 function_zxy_row_key/6/57/29
>&2 echo "Test server response for table source with different SRID" >&2 echo "***** Test server response for table source with different SRID *****"
test_pbf points3857_srid_0_0_0 points3857/0/0/0 test_pbf points3857_srid_0_0_0 points3857/0/0/0
>&2 echo "Test server response for table source with empty SRID" >&2 echo "***** Test server response for table source with empty SRID *****"
echo "IGNORING: This test is currently failing, and has been failing for a while" echo "IGNORING: This test is currently failing, and has been failing for a while"
echo "IGNORING: " test_pbf points_empty_srid_0_0_0 points_empty_srid/0/0/0 echo "IGNORING: " test_pbf points_empty_srid_0_0_0 points_empty_srid/0/0/0
@@ -145,20 +146,19 @@ grep -e ' ERROR ' -e ' WARN ' test_log_1.txt && exit 1
echo "------------------------------------------------------------------------------------------------------------------------" echo "------------------------------------------------------------------------------------------------------------------------"
echo "Test pre-configured Martin" echo "Test pre-configured Martin"
set -x TEST_OUT_DIR="$(dirname "$0")/output/configured"
mkdir -p "$TEST_OUT_DIR"
ARG=(--config tests/config.yaml) ARG=(--config tests/config.yaml --save-config "$(dirname "$0")/output/given_config.yaml" -W 1)
set -x
$MARTIN_BIN "${ARG[@]}" 2>&1 | tee test_log_2.txt & $MARTIN_BIN "${ARG[@]}" 2>&1 | tee test_log_2.txt &
PROCESS_ID=`jobs -p` PROCESS_ID=`jobs -p`
{ set +x; } 2> /dev/null { set +x; } 2> /dev/null
trap "kill -9 $PROCESS_ID 2> /dev/null || true" EXIT trap "kill -9 $PROCESS_ID 2> /dev/null || true" EXIT
wait_for_martin $PROCESS_ID wait_for_martin $PROCESS_ID
TEST_OUT_DIR="$(dirname "$0")/output/configured"
mkdir -p "$TEST_OUT_DIR"
>&2 echo "Test catalog" >&2 echo "Test catalog"
$CURL "$MARTIN_URL/catalog" | jq --sort-keys -e | tee "$TEST_OUT_DIR/catalog.json" $CURL "$MARTIN_URL/catalog" | jq --sort-keys -e | tee "$TEST_OUT_DIR/catalog_cfg.json"
test_pbf tbl_0_0_0 table_source/0/0/0 test_pbf tbl_0_0_0 table_source/0/0/0
test_pbf cmp_0_0_0 points1,points2/0/0/0 test_pbf cmp_0_0_0 points1,points2/0/0/0
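The retargeted z/x/y paths above follow from this commit's switch to predefined (rather than random) test geometries. For orientation, here is a minimal Rust sketch (not part of this commit) of the standard Web Mercator tile indexing those paths use; the lon/lat values in main() are illustrative, not the actual fixture coordinates:

// Standard Web Mercator (slippy-map) tile indexing.
fn tile_index(lon: f64, lat: f64, zoom: u32) -> (u32, u32) {
    let n = f64::from(1u32 << zoom); // 2^zoom tiles along each axis
    let x = ((lon + 180.0) / 360.0 * n).floor() as u32;
    let lat_rad = lat.to_radians();
    let y = ((1.0 - (lat_rad.tan() + 1.0 / lat_rad.cos()).ln() / std::f64::consts::PI) / 2.0 * n)
        .floor() as u32;
    (x, y)
}

fn main() {
    // A point near lon 141.1, lat 15.4 falls in tile 6/57/29 -- matching
    // the updated table_source/6/57/29 expectation above.
    assert_eq!(tile_index(141.1, 15.4, 6), (57, 29));
}

At zoom 6 each tile spans only 360/64 = 5.625 degrees of longitude, which is why relocating the fixture data shifts every expected tile index in these tests.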
View File
@@ -2,17 +2,19 @@
 #![allow(clippy::redundant_clone)]
 #![allow(clippy::unused_async)]
+use std::collections::HashMap;
 use actix_web::web::Data;
 use log::info;
-use martin::pg::config::PgConfig;
-use martin::pg::config_function::FunctionInfo;
-use martin::pg::config_table::TableInfo;
-use martin::pg::pool::Pool;
-use martin::source::{IdResolver, Source};
-use martin::srv::server::{AppState, Sources};
-use std::collections::HashMap;
-use std::env;
+pub use martin::args::Env;
+use martin::pg::{FunctionInfo, PgConfig, Pool, TableInfo};
+use martin::srv::AppState;
+use martin::{IdResolver, Source, Sources};
 use tilejson::Bounds;
+#[path = "../src/utils/test_utils.rs"]
+mod test_utils;
+#[allow(clippy::wildcard_imports)]
+pub use test_utils::*;
 //
 // This file is used by many tests and benchmarks using the #[path] attribute.
@@ -27,10 +29,12 @@ pub async fn mock_config(
     tables: Option<Vec<(&'static str, TableInfo)>>,
     default_srid: Option<i32>,
 ) -> PgConfig {
-    let connection_string: String = env::var("DATABASE_URL").unwrap();
-    info!("Connecting to {connection_string}");
-    let config = PgConfig {
-        connection_string: Some(connection_string),
+    let Ok(db_url) = std::env::var("DATABASE_URL") else {
+        panic!("DATABASE_URL env var is not set. Unable to do integration tests");
+    };
+    info!("Connecting to {db_url}");
+    let mut config = PgConfig {
+        connection_string: Some(db_url),
         default_srid,
         tables: tables.map(|s| {
             s.iter()
@@ -44,7 +48,8 @@ pub async fn mock_config(
         }),
         ..Default::default()
     };
-    config.finalize().expect("Unable to finalize config")
+    config.finalize().expect("Unable to finalize config");
+    config
 }

 #[allow(dead_code)]
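The rewritten mock_config above swaps a bare unwrap() for a let-else guard, so a missing DATABASE_URL fails with an actionable message instead of an opaque `Result::unwrap()` panic. The pattern in isolation (a sketch; `require_env` is an illustrative helper, not a function in this codebase):

fn require_env(name: &str) -> String {
    // Bind on success; otherwise panic with context the developer can act on.
    let Ok(value) = std::env::var(name) else {
        panic!("{name} env var is not set. Unable to do integration tests");
    };
    value
}

fn main() {
    std::env::set_var("DATABASE_URL", "postgresql://postgres@localhost/db");
    println!("Connecting to {}", require_env("DATABASE_URL"));
}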
@@ -138,6 +143,14 @@ pub fn mock_func_config_map() -> HashMap<&'static str, FunctionInfo> {
             ..default.clone()
         },
     ),
+    (
+        "function_zxy_query_jsonb",
+        FunctionInfo {
+            schema: "public".to_string(),
+            function: "function_zxy_query_jsonb".to_string(),
+            ..default.clone()
+        },
+    ),
     (
         "function_zxy_row",
         FunctionInfo {
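The new entry registers the function_zxy_query_jsonb fixture backing this commit's jsonb query-parameter support: URL query-string parameters are forwarded to the function source as a JSON object alongside z/x/y. A hedged client-side sketch of what the fnc_b_6_38_20 check in the script exercises (uses the reqwest crate with its `blocking` feature; the URL and token parameter are illustrative):

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Extra query params (here `token`) ride along with z/x/y and reach
    // the Postgres function as a JSON object.
    let url = "http://localhost:3000/function_zxy_query_jsonb/6/57/29?token=martin";
    let tile = reqwest::blocking::get(url)?.bytes()?;
    assert!(!tile.is_empty(), "expected a non-empty MVT tile");
    Ok(())
}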
@@ -197,7 +210,7 @@ pub fn mock_table_config_map() -> HashMap<&'static str, TableInfo> {
             schema: "public".to_string(),
             table: "points1".to_string(),
             geometry_column: "geom".to_string(),
-            geometry_type: some_str("POINT"),
+            geometry_type: some("POINT"),
             properties: props(&[("gid", "int4")]),
             ..default.clone()
         },
@@ -208,7 +221,7 @@ pub fn mock_table_config_map() -> HashMap<&'static str, TableInfo> {
             schema: "public".to_string(),
             table: "points2".to_string(),
             geometry_column: "geom".to_string(),
-            geometry_type: some_str("POINT"),
+            geometry_type: some("POINT"),
             properties: props(&[("gid", "int4")]),
             ..default.clone()
         },
@@ -220,8 +233,8 @@ pub fn mock_table_config_map() -> HashMap<&'static str, TableInfo> {
             schema: "MIXEDCASE".to_string(),
             table: "mixPoints".to_string(),
             geometry_column: "geoM".to_string(),
-            geometry_type: some_str("POINT"),
-            id_column: some_str("giD"),
+            geometry_type: some("POINT"),
+            id_column: some("giD"),
             properties: props(&[("tAble", "text")]),
             ..default.clone()
         },
@@ -233,7 +246,7 @@ pub fn mock_table_config_map() -> HashMap<&'static str, TableInfo> {
             table: "points3857".to_string(),
             srid: 3857,
             geometry_column: "geom".to_string(),
-            geometry_type: some_str("POINT"),
+            geometry_type: some("POINT"),
             properties: props(&[("gid", "int4")]),
             ..default.clone()
         },
@@ -245,7 +258,7 @@ pub fn mock_table_config_map() -> HashMap<&'static str, TableInfo> {
             table: "points_empty_srid".to_string(),
             srid: 900_973,
             geometry_column: "geom".to_string(),
-            geometry_type: some_str("GEOMETRY"),
+            geometry_type: some("GEOMETRY"),
             properties: props(&[("gid", "int4")]),
             ..default.clone()
         },
@@ -256,7 +269,7 @@ pub fn mock_table_config_map() -> HashMap<&'static str, TableInfo> {
             schema: "public".to_string(),
             table: "table_source".to_string(),
             geometry_column: "geom".to_string(),
-            geometry_type: some_str("GEOMETRY"),
+            geometry_type: some("GEOMETRY"),
             properties: props(&[("gid", "int4")]),
             ..default.clone()
         },
@@ -267,7 +280,7 @@ pub fn mock_table_config_map() -> HashMap<&'static str, TableInfo> {
             schema: "public".to_string(),
             table: "table_source_multiple_geom".to_string(),
             geometry_column: "geom1".to_string(),
-            geometry_type: some_str("POINT"),
+            geometry_type: some("POINT"),
             properties: props(&[("geom2", "geometry"), ("gid", "int4")]),
             ..default.clone()
         },
@@ -278,7 +291,7 @@ pub fn mock_table_config_map() -> HashMap<&'static str, TableInfo> {
             schema: "public".to_string(),
             table: "table_source_multiple_geom".to_string(),
             geometry_column: "geom2".to_string(),
-            geometry_type: some_str("POINT"),
+            geometry_type: some("POINT"),
             properties: props(&[("gid", "int4"), ("geom1", "geometry")]),
             ..default.clone()
         },
@@ -309,9 +322,3 @@ pub fn source<'a>(mock: &'a MockSource, name: &str) -> &'a dyn Source {
     let (sources, _) = mock;
     sources.get(name).unwrap().as_ref()
 }
-
-#[allow(dead_code, clippy::unnecessary_wraps)]
-#[must_use]
-pub fn some_str(s: &str) -> Option<String> {
-    Some(s.to_string())
-}