Commit 7d665d6163

It looks like GCP doesn't like not having a "page suffix" set, so it sets a default. Except somehow Terraform doesn't know it's a default value, so when trying to plan without the (optional) `website` value set, Terraform will always find that the deployed state has changed. With this change, we set it to a value that doesn't exist and won't work, but at least Terraform will see that the deployed state matches the configured one.

Note: this PR is a bit special as far as "changes" go, as there will be nothing to apply: applying current master tries to get rid of this `website.main_page_suffix` value, but it's back on the next run. With this patch, `terraform plan` declares "nothing to apply", so this PR itself won't (need to) be applied.

CHANGELOG_BEGIN
CHANGELOG_END
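A minimal sketch of the idea described above, assuming a `google_storage_bucket` resource; the resource name, bucket name, and the exact suffix are illustrative, not the values used in this repository:

```hcl
# Illustrative sketch only: pin website.main_page_suffix explicitly so the
# configured state matches what GCP reports back, instead of Terraform
# seeing a phantom change on every plan.
resource "google_storage_bucket" "cache" {
  name     = "example-cache-bucket" # hypothetical bucket name
  location = "US"

  website {
    # Deliberately points at an object that does not exist and is never
    # served; it is only here to keep `terraform plan` clean.
    main_page_suffix = "no-such-page.html"
  }
}
```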
Files in this module: `google_compute.tf`, `google_storage.tf`, `outputs.tf`, `README.md`, `variables.tf`.
# A Google Storage Bucket + CDN configuration
This module contains essentially two things:
- A GCS bucket to store objects into
- A load-balancer connected to it
It also makes a few assumptions (sketched right after this list):
- A service account will be created to write into the bucket
- All objects are meant to be publicly-readable
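A hedged sketch of how those two assumptions typically translate into Terraform resources, building on the bucket sketched earlier; the resource names and role choices are illustrative, not read from this module:

```hcl
# Illustrative only: a dedicated writer service account plus bucket-level
# IAM grants for public reads and service-account writes.
resource "google_service_account" "bucket_writer" {
  account_id   = "bucket-writer" # hypothetical account id
  display_name = "Writes objects into the CDN bucket"
}

# Everyone can read the objects (they are meant to be publicly readable).
resource "google_storage_bucket_iam_member" "public_read" {
  bucket = google_storage_bucket.cache.name # bucket from the sketch above
  role   = "roles/storage.objectViewer"
  member = "allUsers"
}

# Only the dedicated service account can write new objects.
resource "google_storage_bucket_iam_member" "writer" {
  bucket = google_storage_bucket.cache.name
  role   = "roles/storage.objectCreator"
  member = "serviceAccount:${google_service_account.bucket_writer.email}"
}
```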
## Module config

Generated with `terraform-docs md .`:
### Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|----------|
| cache_retention_days | The number of days to keep the objects around | string | n/a | yes |
| labels | Labels to apply on all the resources | map | `<map>` | no |
| name | Name prefix for all the resources | string | n/a | yes |
| project | GCP project name | string | n/a | yes |
| region | GCP region in which to create the resources | string | n/a | yes |
| ssl_certificate | A reference to the SSL certificate, google managed or not | string | n/a | yes |
### Outputs

| Name | Description |
|------|-------------|
| bucket_name | Name of the GCS bucket that will receive the objects. |
| external_ip | The external IP assigned to the global forwarding rule. |
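For reference, a hedged example of how a parent configuration might instantiate this module using the inputs and outputs documented above; the module source path, project, and certificate reference are assumptions, not taken from this repository:

```hcl
# Hypothetical caller; adjust the source path and values to your setup.
module "cdn_bucket" {
  source = "./gcp_cdn_bucket" # assumed path to this module

  name                 = "docs-cache"
  project              = "my-gcp-project"
  region               = "us-east1"
  labels               = { env = "example" }
  cache_retention_days = "30" # documented type is string
  ssl_certificate      = "https://www.googleapis.com/compute/v1/projects/my-gcp-project/global/sslCertificates/docs-cert"
}

# Expose the module outputs, e.g. to point a DNS record at the CDN.
output "docs_cache_external_ip" {
  value = module.cdn_bucket.external_ip
}

output "docs_cache_bucket_name" {
  value = module.cdn_bucket.bucket_name
}
```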