analytics/lib/plausible_web/email.ex
defmodule PlausibleWeb.Email do
  @moduledoc "Transactional and notification emails sent by Plausible, built with Bamboo."

  use Plausible
  use Bamboo.Phoenix, view: PlausibleWeb.EmailView
  import Bamboo.PostmarkHelper

  def mailer_email_from do
    Application.get_env(:plausible, :mailer_email)
  end

  def activation_email(user, code) do
    priority_email()
    |> to(user)
    |> tag("activation-email")
    |> subject("#{code} is your Plausible email verification code")
    |> render("activation_email.html", user: user, code: code)
  end
  def welcome_email(user) do
    base_email()
    |> to(user)
    |> tag("welcome-email")
    |> subject("Welcome to Plausible")
    |> render("welcome_email.html", user: user)
  end

  def create_site_email(user) do
    base_email()
    |> to(user)
    |> tag("create-site-email")
    |> subject("Your Plausible setup: Add your website details")
    |> render("create_site_email.html", user: user)
  end
  def site_setup_help(user, site) do
    base_email()
    |> to(user)
    |> tag("help-email")
    |> subject("Your Plausible setup: Waiting for the first page views")
    |> render("site_setup_help_email.html",
      user: user,
      site: site
    )
  end

  def site_setup_success(user, site) do
    base_email()
    |> to(user)
    |> tag("setup-success-email")
    |> subject("Plausible is now tracking your website stats")
    |> render("site_setup_success_email.html",
      user: user,
      site: site
    )
  end

  def check_stats_email(user) do
    base_email()
    |> to(user)
    |> tag("check-stats-email")
    |> subject("Check your Plausible website stats")
    |> render("check_stats_email.html", user: user)
  end
  def password_reset_email(email, reset_link) do
    priority_email(%{layout: nil})
    |> to(email)
    |> tag("password-reset-email")
    |> subject("Plausible password reset")
    |> render("password_reset_email.html", reset_link: reset_link)
  end

  def two_factor_enabled_email(user) do
    priority_email()
    |> to(user)
    |> tag("two-factor-enabled-email")
    |> subject("Plausible Two-Factor Authentication enabled")
    |> render("two_factor_enabled_email.html", user: user)
  end

  def two_factor_disabled_email(user) do
    priority_email()
    |> to(user)
    |> tag("two-factor-disabled-email")
    |> subject("Plausible Two-Factor Authentication disabled")
    |> render("two_factor_disabled_email.html", user: user)
  end
  def trial_one_week_reminder(user) do
    base_email()
    |> to(user)
    |> tag("trial-one-week-reminder")
    |> subject("Your Plausible trial expires next week")
    |> render("trial_one_week_reminder.html", user: user)
  end

  def trial_upgrade_email(user, day, usage) do
    suggested_plan = Plausible.Billing.Plans.suggest(user, usage.total)

    base_email()
    |> to(user)
    |> tag("trial-upgrade-email")
    |> subject("Your Plausible trial ends #{day}")
    |> render("trial_upgrade_email.html",
      user: user,
      day: day,
      custom_events: usage.custom_events,
      usage: usage.total,
      suggested_plan: suggested_plan
    )
  end

  def trial_over_email(user) do
    base_email()
    |> to(user)
    |> tag("trial-over-email")
    |> subject("Your Plausible trial has ended")
    |> render("trial_over_email.html",
      user: user,
      extra_offset: Plausible.Auth.User.trial_accept_traffic_until_offset_days()
    )
  end

  def stats_report(email, assigns) do
    base_email(%{layout: nil})
    |> to(email)
    |> tag("#{assigns.type}-report")
    |> subject("#{assigns.name} report for #{assigns.site.domain}")
    |> html_body(PlausibleWeb.MJML.StatsReport.render(assigns))
  end
  def spike_notification(email, site, current_visitors, sources, dashboard_link) do
    base_email()
    |> to(email)
    |> tag("spike-notification")
    |> subject("Traffic Spike on #{site.domain}")
    |> render("spike_notification.html", %{
      site: site,
      current_visitors: current_visitors,
      sources: sources,
      link: dashboard_link
    })
  end

  def over_limit_email(user, usage, suggested_plan) do
    priority_email()
    |> to(user)
    |> tag("over-limit")
    |> subject("[Action required] You have outgrown your Plausible subscription tier")
    |> render("over_limit.html", %{
      user: user,
      usage: usage,
      suggested_plan: suggested_plan
    })
  end
  def enterprise_over_limit_internal_email(user, pageview_usage, site_usage, site_allowance) do
    base_email(%{layout: nil})
    |> to("enterprise@plausible.io")
    |> tag("enterprise-over-limit")
    |> subject("#{user.email} has outgrown their enterprise plan")
    |> render("enterprise_over_limit_internal.html", %{
      user: user,
      pageview_usage: pageview_usage,
      site_usage: site_usage,
      site_allowance: site_allowance
    })
  end

  def dashboard_locked(user, usage, suggested_plan) do
    priority_email()
    |> to(user)
    |> tag("dashboard-locked")
    |> subject("[Action required] Your Plausible dashboard is now locked")
    |> render("dashboard_locked.html", %{
      user: user,
      usage: usage,
      suggested_plan: suggested_plan
    })
  end

  def yearly_renewal_notification(user) do
    date = Timex.format!(user.subscription.next_bill_date, "{Mfull} {D}, {YYYY}")

    priority_email()
    |> to(user)
    |> tag("yearly-renewal")
    |> subject("Your Plausible subscription is up for renewal")
    |> render("yearly_renewal_notification.html", %{
      user: user,
      date: date,
      next_bill_amount: user.subscription.next_bill_amount,
      currency: user.subscription.currency_code
    })
  end
  def yearly_expiration_notification(user) do
    next_bill_date = Timex.format!(user.subscription.next_bill_date, "{Mfull} {D}, {YYYY}")

    accept_traffic_until =
      user
      |> Plausible.Users.accept_traffic_until()
      |> Timex.format!("{Mfull} {D}, {YYYY}")

    priority_email()
    |> to(user)
    |> tag("yearly-expiration")
    |> subject("Your Plausible subscription is about to expire")
    |> render("yearly_expiration_notification.html", %{
      user: user,
      next_bill_date: next_bill_date,
      accept_traffic_until: accept_traffic_until
    })
  end

  def cancellation_email(user) do
    base_email()
    |> to(user.email)
    |> tag("cancelled-email")
    |> subject("Mind sharing your thoughts on Plausible?")
    |> render("cancellation_email.html", user: user)
  end
  def new_user_invitation(invitation) do
    priority_email()
    |> to(invitation.email)
    |> tag("new-user-invitation")
    |> subject("[Plausible Analytics] You've been invited to #{invitation.site.domain}")
    |> render("new_user_invitation.html",
      invitation: invitation
    )
  end

  def existing_user_invitation(invitation) do
    priority_email()
    |> to(invitation.email)
    |> tag("existing-user-invitation")
    |> subject("[Plausible Analytics] You've been invited to #{invitation.site.domain}")
    |> render("existing_user_invitation.html",
      invitation: invitation
    )
  end

  def ownership_transfer_request(invitation, new_owner_account) do
    priority_email()
    |> to(invitation.email)
    |> tag("ownership-transfer-request")
    |> subject("[Plausible Analytics] Request to transfer ownership of #{invitation.site.domain}")
    |> render("ownership_transfer_request.html",
      invitation: invitation,
      new_owner_account: new_owner_account
    )
  end

  def invitation_accepted(invitation) do
    priority_email()
    |> to(invitation.inviter.email)
    |> tag("invitation-accepted")
    |> subject(
      "[Plausible Analytics] #{invitation.email} accepted your invitation to #{invitation.site.domain}"
    )
    |> render("invitation_accepted.html",
      user: invitation.inviter,
      invitation: invitation
    )
  end

  def invitation_rejected(invitation) do
    priority_email()
    |> to(invitation.inviter.email)
    |> tag("invitation-rejected")
    |> subject(
      "[Plausible Analytics] #{invitation.email} rejected your invitation to #{invitation.site.domain}"
    )
    |> render("invitation_rejected.html",
      user: invitation.inviter,
      invitation: invitation
    )
  end

  def ownership_transfer_accepted(invitation) do
    priority_email()
    |> to(invitation.inviter.email)
    |> tag("ownership-transfer-accepted")
    |> subject(
      "[Plausible Analytics] #{invitation.email} accepted the ownership transfer of #{invitation.site.domain}"
    )
    |> render("ownership_transfer_accepted.html",
      user: invitation.inviter,
      invitation: invitation
    )
  end

  def ownership_transfer_rejected(invitation) do
    priority_email()
    |> to(invitation.inviter.email)
    |> tag("ownership-transfer-rejected")
    |> subject(
      "[Plausible Analytics] #{invitation.email} rejected the ownership transfer of #{invitation.site.domain}"
    )
    |> render("ownership_transfer_rejected.html",
      user: invitation.inviter,
      invitation: invitation
    )
  end

  def site_member_removed(membership) do
    priority_email()
    |> to(membership.user.email)
    |> tag("site-member-removed")
    |> subject("[Plausible Analytics] Your access to #{membership.site.domain} has been revoked")
    |> render("site_member_removed.html",
      user: membership.user,
      membership: membership
    )
  end
Add multiple imports per site (#3724) * Clean up references to no longer active `google_analytics_imports` Oban queue * Stub CSV importer * Add SiteImport schema * Rename `Plausible.Imported` module file to match module name * Add `import_id` column to `Imported.*` CH schemas * Implement Importer behavior and manage imports state using new entities * Implement importer callbacks and maintain site.imported_data for UA * Keep imports in sync when forgetting all imports * Scope imported data queries to completed import IDs * Mark newly imported data with respective import ID * Clean up Importer implementation a bit * Test querying legacy and new imported data * Send Oban notifications on import worker failure too * Fix checking for forgettable imports and remove redundant function * Fix UA integration test * Change site import source to atom enum and add source label * Add typespecs and reduce repetition in `Plausible.Imported` * Improve documentation and typespecs * Add test for purging particular import * Switch email notification templates depending on import source * Document running import synchronously * Fix UA importer args parsing and ensure it's covered by tests * Clear `site.stats_start_date` on complete import to force recalculation * Test Oban notifications (h/t @ruslandoga) * Purge stats on import failure right away to reduce a chance of leaving debris behind * Fix typos Co-authored-by: hq1 <hq@mtod.org> * Fix another typo * Refactor fetching earliest import and earliest stats start date * Use `Date.after?` instead of `Timex.after?` * Cache import data in site virtual fields and limit queried imports to 5 * Ensure always current `stats_start_date` is used * Work around broken typespec in Timex * Make `SiteController.forget_imported` action idempotent * Discard irrecoverably failed import tasks * Use macros for site import statuses There's also a fix ensuring only complete imports are considered where relevant - couldn't isolate it as it was in a common 
hunk * Use `import_id` as worker job uniqueness criterion * Do not load imported stats data in plugins API context --------- Co-authored-by: hq1 <hq@mtod.org>
2024-02-14 11:32:36 +03:00
def import_success(site_import, user) do
import_api = Plausible.Imported.ImportSources.by_name(site_import.source)
label = import_api.label()
priority_email()
[Continued] Google Analytics import (#1753) * Add has_imported_stats boolean to Site * Add Google Analytics import panel to general settings * Get GA profiles to display in import settings panel * Add import_from_google method as entrypoint to import data * Add imported_visitors table * Remove conflicting code from migration * Import visitors data into clickhouse database * Pass another dataset to main graph for rendering in red This adds another entry to the JSON data returned via the main graph API called `imported_plot`, which is similar to `plot` in form but will be completed with previously imported data. Currently it simply returns the values from `plot` / 2. The data is rendered in the main graph in red without fill, and without an indicator for the present. Rationale: imported data will not continue to grow so there is no projection forward, only backwards. * Hook imported GA data to dashboard timeseries plot * Add settings option to forget imported data * Import sources from google analytics * Merge imported sources when queried * Merge imported source data native data when querying sources * Start converting metrics to atoms so they can be subqueried This changes "visitors" and in some places "sources" to atoms. This does not change the behaviour of the functions - the tests all pass unchanged following this commit. This is necessary as joining subqueries requires that the keys in `select` statements be atoms and not strings. 
* Convery GA (direct) source to empty string * Import utm campaign and utm medium from GA * format * Import all data types from GA into new tables * Handle large amounts of more data more safely * Fix some mistakes in tables * Make GA requests in chunks of 5 queries * Only display imported timeseries when there is no filter * Correctly show last 30 minutes timeseries when 'realtime' * Add with_imported key to Query struct * Account for injected :is_not filter on sources from dashboard * Also add tentative imported_utm_sources table This needs a bit more work on the google import side, as GA do not report sources and utm sources as distinct things. * Return imported data to dashboard for rest of Sources panel This extends the merge_imported function definition for sources to utm_sources, utm_mediums and utm_campaigns too. This appears to be working on the DB side but something is incomplete on the client side. * Clear imported stats from all tables when requested * Merge entry pages and exit pages from imported data into unfiltered dashboard view This requires converting the `"visits"` and `"visit_duration"` metrics to atoms so that they can be used in ecto subqueries. * Display imported devices, browsers and OSs on dashboard * Display imported country data on dashboard * Add more metrics to entries/exits for modals * make sure data is returned via API with correct keys * Import regions and cities from GA * Capitalize device upon import to match native data * Leave query limits/offsets until after possibly joining with imported data * Also import timeOnPage and pageviews for pages from GA * imported_countries -> imported_locations * Get timeOnPage and pageviews for pages from GA These are needed for the pages modal, and for calculating exit rates for exit pages. 
* Add indicator to dashboard when imported data is being used * Don't show imported data as separately line on main graph * "bounce_rate" -> :bounce_rate, so it works in subqueries * Drop imported browser and OS versions These are not needed. * Toggle displaying imported data by clicking indicator * Parse referrers with RefInspector - Use 'ga:fullReferrer' instead of 'ga:source'. This provides the actual referrer host + path, whereas 'ga:source' includes utm_mediums and other values when relevant. - 'ga:fullReferror' does however include search engine names directly, so they are manually checked for as RefInspector won't pick up on these. * Keep imported data indicator on dashboard and strikethrough when hidden * Add unlink google button to import panel * Rename some GA browsers and OSes to plausible versions * Get main top pages and exit pages panels working correctly with imported data * mix format * Fetch time_on_pages for imported data when needed * entry pages need to fetch bounces from GA * "sample_percent" -> :sample_percent as only atoms can be used in subqueries * Calculate bounce_rate for joined native and imported data for top pages modal * Flip some query bindings around to be less misleading * Fixup entry page modal visit durations * mix format * Fetch bounces and visit_duration for sources from GA * add more source metrics used for data in modals * Make sources modals display correct values * imported_visitors: bounce_rate -> bounces, avg_visit_duration -> visit_duration * Merge imported data into aggregate stats * Reformat top graph side icons * Ensure sample_percent is yielded from aggregate data * filter event_props should be strings * Hide imported data from frontend when using filter * Fix existing tests * fix tests * Fix imported indicator appearing when filtering * comma needed, lost when rebasing * Import utm_terms and utm_content from GA * Merge imported utm_term and utm_content * Rename imported Countries data as Locations * Set imported 
|> to(user)
|> tag("import-success-email")
|> subject("#{label} data imported for #{site_import.site.domain}")
|> render(import_api.email_template(), %{
site_import: site_import,
label: label,
link: PlausibleWeb.Endpoint.url() <> "/" <> URI.encode_www_form(site_import.site.domain),
user: user,
success: true
})
end
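  # Usage sketch (hypothetical caller; assumes a `site_import` with its
  # `site` association preloaded and a `user` struct, and that the
  # resulting Bamboo email is delivered via the application's mailer):
  #
  #     site_import
  #     |> Repo.preload(:site)
  #     |> then(&PlausibleWeb.Email.import_success(&1, user))
  #     |> Plausible.Mailer.send()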
def import_failure(site_import, user) do
import_api = Plausible.Imported.ImportSources.by_name(site_import.source)
label = import_api.label()
priority_email()
|> to(user)
|> tag("import-failure-email")
|> subject("#{label} import failed for #{site_import.site.domain}")
|> render(import_api.email_template(), %{
site_import: site_import,
label: label,
else -> just a UTM source * Keep prop names in queries as strings * fix typo * Fix import * Insert data to clickhouse in batches * Fix link when removing imported data * Merge source tables * Import hostname as well as pathname * Record start and end time of imported data * Track import progress * Fix month interval with imported data * Do not JOIN when imported date range has no overlap * Fix time on page using exits Co-authored-by: mcol <mcol@posteo.net>
2022-03-11 00:04:59 +03:00
user: user,
success: false
})
end
def export_success(user, site, download_url, expires_at) do
subject =
on_ee do
"Your Plausible Analytics export is now ready for download"
else
"Your export is now ready for download"
end
# Human-readable relative phrase (e.g. "in 2 days"), computed against now
expires_in =
  if expires_at do
    Timex.Format.DateTime.Formatters.Relative.format!(
      expires_at,
      "{relative}"
    )
  end
priority_email()
|> to(user)
|> tag("export-success")
|> subject(subject)
|> render("export_success.html",
user: user,
site: site,
download_url: download_url,
expires_in: expires_in
)
end
def export_failure(user, site) do
subject =
on_ee do
"Your Plausible Analytics export has failed"
else
"Your export has failed"
end
priority_email()
|> to(user)
|> subject(subject)
|> render("export_failure.html", user: user, site: site)
end
def error_report(reported_by, trace_id, feedback) do
base_email(%{layout: nil})
|> to("bugs@plausible.io")
|> put_param("ReplyTo", reported_by)
|> tag("sentry")
|> subject("Feedback to Sentry Trace #{trace_id}")
|> render("error_report_email.html", %{
reported_by: reported_by,
feedback: feedback,
trace_id: trace_id
})
end
def approaching_accept_traffic_until(notification) do
base_email()
|> to(notification.email)
|> tag("drop-traffic-warning-first")
|> subject("We'll stop counting your stats")
|> render("approaching_accept_traffic_until.html",
time: "next week",
user: %{email: notification.email, name: notification.name}
)
end
def approaching_accept_traffic_until_tomorrow(notification) do
base_email()
|> to(notification.email)
|> tag("drop-traffic-warning-final")
|> subject("A reminder that we'll stop counting your stats tomorrow")
|> render("approaching_accept_traffic_until.html",
time: "tomorrow",
user: %{email: notification.email, name: notification.name}
)
end
@doc """
Unlike the default 'base' emails, priority emails cannot be unsubscribed from. This is achieved
by sending them through a dedicated 'priority' message stream in Postmark.
"""
def priority_email(), do: priority_email(%{layout: "priority_email.html"})
def priority_email(%{layout: layout}) do
base_email(%{layout: layout})
|> put_param("MessageStream", "priority")
end
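# Usage sketch (illustrative comment only; these pipelines mirror the
# export emails defined above and add no code to this module):
#
#     priority_email()
#     |> to(user)
#     |> tag("export-success")
#     |> subject("Your export is now ready for download")
#
# Postmark routes messages carrying the "MessageStream" => "priority"
# param through the dedicated priority stream, which is what exempts
# them from unsubscribe handling.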
def base_email(), do: base_email(%{layout: "base_email.html"})
def base_email(%{layout: layout}) do
mailer_from = Application.get_env(:plausible, :mailer_email)
new_email()
|> put_param("TrackOpens", false)
|> from(mailer_from)
|> maybe_put_layout(layout)
end
defp maybe_put_layout(email, nil), do: email
defp maybe_put_layout(email, layout) do
put_html_layout(email, {PlausibleWeb.LayoutView, layout})
end
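# Layout selection sketch (illustrative comment): emails that should
# render without an HTML layout pass `layout: nil`, as error_report/3
# does above, so maybe_put_layout/2 matches its first clause and leaves
# the email untouched:
#
#     base_email(%{layout: nil})  # no layout applied
#     base_email()                # wraps body in base_email.html
#     priority_email()            # wraps body in priority_email.html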
end